I’ve been profiling the SQL performance of an app written at the place I work, which I currently maintain. The system consists of a Windows Forms front end with a SQL back end.
We also have sets of business objects that are responsible for the create/update/retrieve operations on the database.
These objects expose methods for fetching either a single object or a collection of objects from the database.
The issue I see is that, in a lot of forms, the business objects are interrogated several times. Most of the time the business object is retrieved from the database before its properties are queried.
Of course, each of those retrievals requires a database query, which means that certain forms generate several database hits per second for a single user. At the moment there are only a handful of users, but as more users take up our system I’m concerned that performance may become an issue.
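To give an idea of the pattern (the names below are invented for illustration, not our actual code):

```csharp
// Hypothetical illustration of the pattern on many of our forms.
// Every call to GetByKey() goes back to the database, so populating
// a single form causes several round trips.
companyNameLabel.Text = Setting.GetByKey("CompanyName").Value; // DB hit
currencyLabel.Text    = Setting.GetByKey("Currency").Value;    // DB hit
footerLabel.Text      = Setting.GetByKey("FooterText").Value;  // DB hit
```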
I’ve considered moving all the config data into a set of static classes that only fetch the data at app startup, but the problem there is that when application settings are changed in the administration section of the app, the change needs to be propagated to any clients already running the app.
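Something like this is what I had in mind (a rough sketch; `SettingsRepository` and its methods are hypothetical stand-ins for our data access layer):

```csharp
using System.Collections.Generic;
using System.Linq;

// Rough sketch of the static-cache idea.
public static class SettingsCache
{
    private static Dictionary<string, string> _settings;

    // Called once at application startup.
    public static void Load()
    {
        _settings = SettingsRepository.GetAll()
                                      .ToDictionary(s => s.Key, s => s.Value);
    }

    // No database hit after startup - but also no way to see changes
    // made in the administration section until the app is restarted.
    public static string Get(string key)
    {
        return _settings[key];
    }
}
```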
I thought that adding some sort of ‘last updated’ timestamp to a config table might help with that: the application would compare it against the timestamp of its cached settings and refresh them before business objects were retrieved again. The issue I can see with this approach is that the check would have to hit the database each time a business object is queried, which could mean many more (albeit small) hits on the database per second.
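Extending the sketch above, the check might look something like this (again with invented names; `GetLastUpdated()` would be a single-value SELECT against the config table):

```csharp
private static DateTime _loadedAt;

public static string Get(string key)
{
    // One small query per access just to find out whether the cache is stale.
    DateTime lastUpdated = SettingsRepository.GetLastUpdated();
    if (_settings == null || lastUpdated > _loadedAt)
    {
        // Settings changed since we cached them - reload the lot.
        _settings = SettingsRepository.GetAll()
                                      .ToDictionary(s => s.Key, s => s.Value);
        _loadedAt = lastUpdated;
    }
    return _settings[key];
}
```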
Has anyone had experience with something like this? Can anyone suggest a way around the problem of keeping up-to-date settings in memory?