simpledeveloper wrote:Hi I have a table which has 100s of 1000s of records
simpledeveloper wrote:huge json string.
32k maximum size for a cell
1,048,576 maximum rows
Excel specifications and limits - Excel[^]
If you already have 200,000 rows then a reasonable growth estimate would mean your solution would need to support 2,000,000 rows. And that is not going to happen in Excel.
You didn't define 'huge' nor what that json contains, but see the cell size and row limits above. Again, if you are already pushing those then growth suggests you will exceed them.
simpledeveloper wrote:So he wanted me to write a Service to convert that Json string into Excel document,
Is this a one-time job? On demand? Once a day? Once a minute? That will impact what is reasonable in terms of how long your job can take to process this.
simpledeveloper wrote:convert that Json string into Excel document
Is json the only problem? Basically the acronym 'ETL' (Extract, Transform, Load) exists for this very thing. So why not do the following:
1. Create another database, do not try to do this in the existing one.
2. Create a job that incrementally processes the rows and flattens out the data into the new database.
3. Then the other person uses that database, with the data nicely parsed out to do what they want.
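The incremental job in step 2 could be sketched roughly like this (a minimal Python/sqlite sketch, assuming a flat, stable json payload; the table and column names `source_rows`, `payload`, and `flat_rows` are my own placeholders, not anything from your system):

```python
import json
import sqlite3

BATCH_SIZE = 1000

def run_batch(src: sqlite3.Connection, dst: sqlite3.Connection) -> int:
    """Process one batch of unprocessed source rows; return how many were handled."""
    # Remember where the last run stopped so each run is incremental.
    dst.execute("CREATE TABLE IF NOT EXISTS progress (last_id INTEGER)")
    dst.execute("CREATE TABLE IF NOT EXISTS flat_rows "
                "(src_id INTEGER PRIMARY KEY, name TEXT, amount REAL)")
    row = dst.execute("SELECT last_id FROM progress").fetchone()
    last_id = row[0] if row else 0

    rows = src.execute(
        "SELECT id, payload FROM source_rows WHERE id > ? ORDER BY id LIMIT ?",
        (last_id, BATCH_SIZE)).fetchall()

    for src_id, payload in rows:
        doc = json.loads(payload)
        # Stable json: just pick the known fields out into named columns.
        dst.execute("INSERT OR REPLACE INTO flat_rows VALUES (?, ?, ?)",
                    (src_id, doc.get("name"), doc.get("amount")))
        last_id = src_id

    dst.execute("DELETE FROM progress")
    dst.execute("INSERT INTO progress VALUES (?)", (last_id,))
    dst.commit()
    return len(rows)
```

Run it on a schedule; each run only touches rows it has not seen yet, so it never has to chew through hundreds of thousands of rows at once.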
How to flatten it out? Well, if the data is nice and stable then you could just create a table with the columns named appropriately.
If however the json includes lots of depth and is dynamic, then you will need to create a metadata solution, such as a table that stores a name and a type, and perhaps another table with values. The types allow you to handle arrays and other embedded (even circular) types. Keep in mind that this is a problem you would have needed to deal with in the Excel idea anyway.
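The metadata idea above could look something like this (again a rough Python/sqlite sketch of my own devising, not a prescription: one `nodes` table where every json node gets a name, a type, a value for scalars, and a parent link so arrays and nested objects keep their shape):

```python
import json
import sqlite3

def store(db: sqlite3.Connection, doc) -> int:
    """Store an arbitrarily nested json document; return the root node id."""
    db.execute("CREATE TABLE IF NOT EXISTS nodes ("
               "id INTEGER PRIMARY KEY AUTOINCREMENT, "
               "parent_id INTEGER, name TEXT, type TEXT, value TEXT)")

    def insert(parent_id, name, node):
        if isinstance(node, dict):
            cur = db.execute("INSERT INTO nodes (parent_id, name, type) "
                             "VALUES (?, ?, 'object')", (parent_id, name))
            for key, child in node.items():
                insert(cur.lastrowid, key, child)
            return cur.lastrowid
        if isinstance(node, list):
            cur = db.execute("INSERT INTO nodes (parent_id, name, type) "
                             "VALUES (?, ?, 'array')", (parent_id, name))
            for i, child in enumerate(node):
                insert(cur.lastrowid, str(i), child)
            return cur.lastrowid
        # Scalar: store the value as json text along with its original type.
        cur = db.execute("INSERT INTO nodes (parent_id, name, type, value) "
                         "VALUES (?, ?, ?, ?)",
                         (parent_id, name, type(node).__name__, json.dumps(node)))
        return cur.lastrowid

    root_id = insert(None, "$", doc)
    db.commit()
    return root_id
```

The schema never changes no matter what shows up in the json; the other person can then query by name/type to pull out whatever they need.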
This also allows the data to be kept up to date. Then it is up to them, not you, how often to actually pull the data.