Converting JSON to CSV

A long time ago (I’m slightly embarrassed to admit) I mentioned that I wrote a script to flatten JSON to CSV, and I promised to share it. I didn’t fulfill that promise until today. The reason is that the script was very rough and I wanted to sharpen it before letting it loose on the interwebs. I still haven’t gotten around to properly cleaning it up, and probably won’t for a couple more months or more. However, I decided today to release it as is. I’m sure some will find it useful enough, and maybe someone will be willing to tidy it up.

So here’s the story. There are plenty of tools out there to convert from CSV to JSON; e.g. Mr. Data Converter, CSV to JSON, csv2couch. However, converting from JSON, a flexible self-describing format, to CSV, a much more rigid format, is not as simple. I found some attempts (like jackson-dataformat-csv), but these require a predefined schema. This did not really suit my needs, as I do not want to define a schema for each JSON file I need to convert. I just know that I have JSON with a reliably consistent structure, and I want to convert it to CSV/TSV for another program that can only accept such a format.

I developed my code with those requirements in mind, so be careful: an inconsistent JSON structure will produce a wrong CSV. The code parses the JSON twice: once to discover the schema, and again to convert.
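To give a sense of the two-pass approach described above, here is a minimal Python sketch (not the actual script): the first pass flattens every record and collects the union of keys to serve as the header, and the second pass writes the rows. The function names (`flatten`, `json_to_csv`) are illustrative, not from the original code.

```python
import csv
import json
from io import StringIO


def flatten(obj, prefix=""):
    """Flatten a nested JSON object into dot-separated column names."""
    flat = {}
    for key, value in obj.items():
        name = f"{prefix}.{key}" if prefix else key
        if isinstance(value, dict):
            flat.update(flatten(value, name))
        else:
            flat[name] = value
    return flat


def json_to_csv(json_text):
    """Convert a JSON array of objects to CSV using two passes."""
    records = json.loads(json_text)
    # Pass 1: discover the schema (every flattened key, in first-seen order).
    columns = []
    for record in records:
        for key in flatten(record):
            if key not in columns:
                columns.append(key)
    # Pass 2: convert, leaving a blank cell where a record lacks a key.
    out = StringIO()
    writer = csv.DictWriter(out, fieldnames=columns)
    writer.writeheader()
    for record in records:
        writer.writerow(flatten(record))
    return out.getvalue()
```

For example, `json_to_csv('[{"a": 1, "b": {"c": 2}}]')` yields a header row `a,b.c` followed by `1,2`. Note this sketch handles nested objects but not arrays-within-records, which is where the real ambiguity of JSON-to-CSV conversion lies.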

Here it is. Enjoy, and be gentle 🙂


2 thoughts on “Converting JSON to CSV”

  1. You can fully automate the JSON to CSV conversion process with Flexter, our free JSON converter. Flexter is an ETL tool for JSON and XML. It automates the conversion of JSON to a database, text, or Hadoop. We have written up a blog post (including a video) that shows how easy it is to automatically convert your JSON files to CSV: https://sonra.io/2018/03/16/converting-fhir-json-csv-flexter/. No manual coding needed. In this blog post we convert FHIR JSON, which is based on an industry data standard in healthcare.

    When you convert your JSON files it also provides a diagram of the target model and the data lineage. It can handle JSON of any complexity.

    Try for yourself: https://jsonconverter.sonra.io/

  2. I normally use 4DIQ’s ‘Flow’ framework for my conversions to / from JSON. Never have had any issues with it, even on really large files, and it is convenient in how it is able to interface with anything I use.

    No code, just literally “import” and “export” in the ‘Actions’ bar of the program. Turns your file into generic data and then seamlessly exports it out cleanly into any format you could ever really need.

    Here’s a blog post I found out about Flow from, I highly recommend it: https://flow-analytics.com/blog/how-to-normalize-json-using-flow-analytics
