While developing some code to work with a large data table, I noticed that the GetById() function was taking several seconds to return a result. Caching was enabled, and yet subsequent calls to the same function also took several seconds to execute.
I call my table a large data table, but it bothers me somewhat to do so. The table has ~800K rows, and although it's larger than most tables I work with, it seems perfectly normal and reasonable to have this many entries in a table. I expect objects that interact with databases (in this case the DAL2 repository) to be designed to handle tables of at least this size.
The DAL2 repository does not handle GetById() well against this many entries. If you use the repository the way you would normally expect to, you can shoot yourself in the foot with large data sets. My solution was to abandon GetById() and instead use Find() with custom caching. The full technical details surrounding my experience and the solution can be found in another post if you're interested: http://stackoverflow.com/questions/20....
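To illustrate the shape of that workaround, here is a minimal C# sketch of the Find()-plus-custom-cache approach. The entity name (Person), key column (PersonId), and the choice of a ConcurrentDictionary as the cache are my illustrative assumptions, not details from the original post; the DAL2 calls shown (DataContext.Instance(), GetRepository&lt;T&gt;(), Find()) follow the standard DNN DAL2 API.

```csharp
using System.Collections.Concurrent;
using System.Linq;
using DotNetNuke.Data;

public class PersonRepository
{
    // Custom per-id cache: each id is cached individually, so a lookup
    // never depends on the repository materializing the whole table.
    // (ConcurrentDictionary is an assumption for this sketch; any
    // cache store with per-key access would do.)
    private static readonly ConcurrentDictionary<int, Person> Cache =
        new ConcurrentDictionary<int, Person>();

    public Person GetPerson(int id)
    {
        return Cache.GetOrAdd(id, key =>
        {
            using (IDataContext ctx = DataContext.Instance())
            {
                var rep = ctx.GetRepository<Person>();
                // Find() issues a targeted WHERE query for one row,
                // rather than going through GetById()'s caching path.
                return rep.Find("WHERE PersonId = @0", key).FirstOrDefault();
            }
        });
    }
}
```

The key design point is that the caching is moved out of the repository and into the caller, keyed per id, so only the rows actually requested are ever fetched and held.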