bash: shell table output to json
This post presents a Python script that converts tabular command-line output into the more versatile JSON format. That makes the data easy to manipulate with tools like jq, as an alternative to long text-processing pipelines in bash.
You know how sometimes it would be really great to turn shell output into a more versatile format like JSON or YAML, so you can process it with jq instead of writing long text-processing pipes?
Ah yeah, you could also just use Python instead of bash ;)
Before
$ virsh net-list
 Name      State    Autostart   Persistent
----------------------------------------------------------
 default   active   yes         yes
After
$ virsh net-list | table-to-json
[
    {
        "autostart": "yes",
        "name": "default",
        "persistent": "yes",
        "state": "active"
    }
]
$ virsh net-list | table-to-json | jq -r ".[0].name"
default
table-to-json
#!/usr/bin/env python
import sys
import re
import json


def parse_line(line):
    # Tables usually separate columns with runs of spaces; normalize them
    # to tabs unless the line already contains tabs.
    if "\t" not in line:
        line = re.sub(r'\s+', '\t', line)
    line = line.strip()
    return [x for x in line.split("\t") if x]


lastparts = []
columns = None
data = []
for line in sys.stdin:
    parts = parse_line(line)
    if parts:
        if parts[0].startswith("-"):
            # Separator line: the previous line held the column headers.
            columns = [x.lower() for x in lastparts]
        elif columns:
            data.append(dict(zip(columns, parts)))
    lastparts = parts

print(json.dumps(data, sort_keys=True, indent=4))
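To see the parsing logic in action without virsh, here is a small sketch that feeds the sample table from above through the same loop, reading from a string instead of stdin (the `sample` text is just the example output quoted earlier):

```python
import io
import json
import re


def parse_line(line):
    # Same normalization as the script: spaces to tabs, then split.
    if "\t" not in line:
        line = re.sub(r'\s+', '\t', line)
    return [x for x in line.strip().split("\t") if x]


# Canned input mimicking `virsh net-list`.
sample = """ Name      State    Autostart   Persistent
----------------------------------------------------------
 default   active   yes         yes
"""

lastparts = []
columns = None
data = []
for line in io.StringIO(sample):
    parts = parse_line(line)
    if parts:
        if parts[0].startswith("-"):
            columns = [x.lower() for x in lastparts]
        elif columns:
            data.append(dict(zip(columns, parts)))
    lastparts = parts

print(json.dumps(data, sort_keys=True))
# → [{"autostart": "yes", "name": "default", "persistent": "yes", "state": "active"}]
```

Because the separator line of dashes marks where the headers end, any table that follows this header/separator/rows layout parses the same way.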
UPDATE: Maybe the script should use the asciitable library instead.