danielact
Partner - Creator III

Hardware for large amounts of data

I'm building a dashboard that brings in a table with about 110 million rows and about 25 columns. Currently I can't even get it to load properly, because the load takes up all my RAM and then the ODBC connection fails. Besides the one large table, there are some smaller tables that I'm joining to it, so that I end up with a single table and my final product runs a bit more smoothly.

I did switch the joins around: I had been loading the huge table first, then loading the small tables with join statements. I am now loading the small tables first, then joining them to the big table with resident loads, then dropping the originals. This way the ODBC connection shouldn't drop. It doesn't fix the problem of the load using 99-100% of my RAM, though.
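In case it helps, here's a simplified version of what the script now looks like. The connection name, table names, and field names below are just placeholders for the real ones:

ODBC CONNECT TO [MyDSN];          // placeholder DSN

// Small lookup table loaded first, while the connection is fresh
Regions:
LOAD RegionID,
     RegionName;
SQL SELECT RegionID, RegionName
FROM dbo.Regions;

// The big table (~110M rows, ~25 columns) loaded last
Sales:
LOAD *;
SQL SELECT *
FROM dbo.Sales;

// Join from memory (resident), so the ODBC connection
// isn't held open while the join runs
LEFT JOIN (Sales)
LOAD RegionID,
     RegionName
RESIDENT Regions;

// Drop the original lookup table to free its copy of the data
DROP TABLE Regions;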

My PC currently has 8 GB of RAM and a dual-core 3 GHz processor. Our server has 12 GB of RAM right now. Is that enough to handle a dataset this large, or do I need to upgrade?
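For a rough sense of scale: 110 million rows × 25 columns is about 2.75 billion values. Even if each value averaged only a few bytes in memory after Qlik's compression (just a guess on my part; the real footprint depends heavily on field cardinality), that's already well past 8 GB before the joins run, which would explain why my machine fills up.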

0 Replies