Show HN: Convert Large CSV/XLSX to JSON or XML in Browser

csvforge.com

41 points by Botlabs 2 days ago

Hello HN, I'm excited to share a project I've been working on: a simple, fast way to process huge CSV and XLSX files directly in your browser and export them as clean JSON or XML.

Here are a few things that make this converter different:

- Runs in the browser: all parsing and conversion is client side and can handle data of any size
- Automatically detects delimiters, encodings, and data types as it parses
- Live preview with column renaming, search/replace, and data cleanup
- Export to JSON or XML: clean, structured output ready for APIs or databases
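(For illustration only, not csvforge's actual code: a minimal sketch of what client-side delimiter sniffing and CSV-to-JSON conversion can look like in plain TypeScript, using a naive split that ignores quoted fields.)

    // Hypothetical sketch: guess the delimiter by counting candidate
    // separators in the first line, then map rows to JSON objects.
    function sniffDelimiter(firstLine: string): string {
      const candidates = [",", ";", "\t", "|"];
      let best = ",";
      let bestCount = 0;
      for (const d of candidates) {
        const count = firstLine.split(d).length - 1;
        if (count > bestCount) { best = d; bestCount = count; }
      }
      return best;
    }

    function csvToJson(text: string): Record<string, string>[] {
      // Naive parsing: a real converter also handles quoted fields,
      // escaped delimiters, and embedded newlines.
      const lines = text.split(/\r?\n/).filter((l) => l.length > 0);
      const delimiter = sniffDelimiter(lines[0]);
      const headers = lines[0].split(delimiter);
      return lines.slice(1).map((line) => {
        const cells = line.split(delimiter);
        const row: Record<string, string> = {};
        headers.forEach((h, i) => { row[h] = cells[i] ?? ""; });
        return row;
      });
    }

    // Usage in the browser, given a File from an <input type="file">:
    // const json = csvToJson(await file.text());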

Backstory: I built this tool for myself. I work with massive CSV and TXT files, some over 10GB, and opening them in Excel would freeze my laptop, while most online converters cap the file size. I started learning Python and pandas, but ended up wasting a lot of time trying different delimiters and fixing badly structured data just to make it usable, so I thought this would be a really fun project to build.

I'd love some feedback. Thank you


shubhamjain 2 days ago

I don't get it. Are JSON and XML files friendlier to import than CSV files? I always assumed CSV was the standard. Any reason to prefer structured formats?

Shameless plug: I am working on a similar problem of Excel not being a great tool for large datasets. My desktop app[1] lets you import raw data files and query them using SQL. (The website needs to be updated; the app looks much better than the current screenshots.)

[1]: https://textquery.app

  • Botlabs a day ago

    Yes, they're a lot easier to work with when inserting into a database.

sverhagen 2 days ago

"Runs in the browser" and "client side" isn't as much of a selling point to me as it's made out to be. It's a claim that I can't really validate until it's too late. If it's a commercial service I'm going to have to pay for, then maybe you should go all the way in gaining my trust with whatever safeguards it takes, so that I no longer care if I upload my data to your server or not.

  • strogonoff 2 days ago

    There’s a cheap trick to make sure a website that claims to do everything client-side actually does everything client-side:

    1. Open the site in an incognito window.

    2. Turn off your Internet.

    3. Do what you’ve got to do.

    4. Close browser window.

    As a bonus, and this makes it better than just flipping the offline switch in developer tools, if you turn off Internet in a way that keeps the browser thinking it’s online, you can also peek at whether any network requests are made (for pathological cases where the app does everything locally but phones home anyway).

  • Botlabs a day ago

    Sure, but you can validate it; dev tools exist for a reason. Honestly, I just can't afford the storage costs if users are uploading 50GB+ CSVs. It'd be a huge strain on any server, not to mention painfully slow for users. Running everything client side was the easiest and most practical way to build this MVP, at least for me. Thanks for the feedback.

  • hahn-kev 2 days ago

    Yeah, I really wish there was a way for the browser to enforce this that the end user could trust. It would have to be a standard, but outside of opening dev tools and toggling offline mode there's no way to be sure.

    The funny thing is that it feels safer to download a desktop app and give it the same data even though it's usually much harder to validate if it's shipping your data somewhere else.

  • rustc 2 days ago

    > then maybe you should go all the way in gaining my trust with whatever safeguards it takes

    What kind of safeguards are possible with a web app?

    • sverhagen a day ago

      I think this comes down to legally enforceable contracts with some teeth. A lot of businesses seem okay trusting Google's cloud products, or Microsoft's. As a private person with limited means for litigation, you're likely SOL.

oschvr 2 days ago

Looks like you made it in Lovable. It has that characteristic UI.

If so, how much time did it take you?

  • Botlabs a day ago

    Thanks for your comment. It took me almost 3 weeks to build this.

snappr021 2 days ago

This type of thing is fairly trivial to create with ChatGPT, running entirely locally in HTML.

A couple of kB of open-standard vanilla JS can do some simple things faster than legacy spreadsheets ever could.

Even to the point of creating invoices, reports, etc. based on standard filters stored in localStorage…

o11c 2 days ago

"Large" generally means "bigger than RAM"; 10GB is medium-sized these days since it fits in most people's RAM. Does the browser actually have the (web worker?) APIs needed to stream and "upload" and "download"?

constantcrying 2 days ago

I think it should go without saying, but never use this with anything more important than a hobby project.

Doing this with any kind of data you don't fully own (e.g. data from your company) is a terrible idea, from so many standpoints. That it is "allegedly" running locally doesn't make it much better.

I think my question to OP is: who is this for? Any developer can write a converter for their own datasets, and in basically any case I can think of where you're handling large amounts of data, you're building a pipeline to do cleanup, renaming, conversion, etc. Who wants part of that pipeline to be uploading the data into a browser?