
    Large Data Model and Performance.

      I have an application in development that will access a fairly large number of transactional records: about 12 million per quarter, and I hope to include a full year of data, which means roughly 48 million rows. (I have already aggregated as much as I can.)
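
      For context, the aggregation I mean is along these lines (a minimal sketch only; the table, field, and file names are made up):

          AggregatedFacts:
          LOAD CustomerID,
               DateID,
               Sum(Amount)   as TotalAmount,
               Count(Amount) as TransactionCount
          FROM TransactionDetail.qvd (qvd)
          GROUP BY CustomerID, DateID;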


      It is possible to divide my fact table into two discrete pieces, with roughly 2/3 of the rows in one and 1/3 in the other. Most of the charts do not use both, so there could be some savings there; however, a link table would be needed, so I am not sure how much would actually be saved. A sketch of what I mean follows.
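
      Something like this is what I have in mind (a rough sketch only; the key, field, and file names are hypothetical):

          // Each fact keeps just a composite key plus its own measures.
          FactA:
          LOAD AutoNumberHash128(CustomerID, DateID) as %LinkKey,
               Amount
          FROM FactA.qvd (qvd);

          FactB:
          LOAD AutoNumberHash128(CustomerID, DateID) as %LinkKey,
               Quantity
          FROM FactB.qvd (qvd);

          // The link table carries the shared dimension keys, so the
          // two facts associate through it rather than joining directly.
          TempKeys:
          LOAD CustomerID, DateID FROM FactA.qvd (qvd);
          Concatenate (TempKeys)
          LOAD CustomerID, DateID FROM FactB.qvd (qvd);

          LinkTable:
          LOAD DISTINCT
               AutoNumberHash128(CustomerID, DateID) as %LinkKey,
               CustomerID,
               DateID
          RESIDENT TempKeys;
          DROP TABLE TempKeys;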


      My question is: at what point does performance become an issue? Have I already passed it at 20+ million rows? Testing with 8-million-row tables already has its challenges; it takes a strong server to keep out-of-memory errors from appearing. I do expect the application to be full featured, with many charts and tables.
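
      For rough sizing I have been working from a back-of-the-envelope estimate along these lines (the constants here are guesses on my part; the real footprint depends heavily on field count and cardinality):

          // Very rough sizing sketch -- all constants are placeholders.
          LET vRows        = 48000000;  // full year of transactions
          LET vFields      = 10;        // fields kept in the fact table
          LET vBytesPerVal = 4;         // assumed average bytes per field value
          LET vEstimateGB  = vRows * vFields * vBytesPerVal / pow(1024, 3);
          TRACE Fact table alone: roughly $(vEstimateGB) GB before symbol tables;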


      Is it worth separating the two groupings? Would such a split take advantage of multi-threading or multi-processor capabilities?