There’s nothing very mysterious about why some applications such as Sage Line 50 can run slowly: once you understand what the program is doing, it soon becomes obvious where the problem lies.
Too many people babble on about client/server, databases, flat files, caching and the like without really having the slightest idea what they are talking about.
What’s needed is to know what these terms actually mean and, more importantly, how they affect your systems.
To make an analogy, we’re going to compare what happens in a program like Sage with checking whether you have all the ingredients needed for a recipe.
(For the professionals amongst you: this is not intended as a precise comparison, just an indication of the general way in which things work.)
Let’s say you are seated comfortably in the lounge with your cookbook, and all the potential ingredients are in the kitchen.
The recipe calls for a bag of sugar. You get up from your chair, go to the kitchen, look for a bag of sugar in the cupboard and return to the lounge. The next ingredient is an egg, so it’s back off to the kitchen, check the cupboard and return. You repeat this process for every ingredient. It’s not very efficient and takes a long time.
This is what happens when you run a conventional (non-client/server) program. It has to trot off to the kitchen (the hard drive) to retrieve every piece of data it requires.
In a client/server application, by contrast, the client (you) shouts through to the kitchen to the wife (the server) and asks “hey, have we got eggs, sugar etc.?” (cooking’s not my strong point). The server checks in the cupboard and yells back the answer. In the meantime, if you can multitask (apparently impossible for males), you can get on with something really important like drinking beer and watching the TV.
The result is a single interaction, without you having to go backwards and forwards to check every single item. Much faster, if a little annoying to the wife, who may have been doing something else.
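The two approaches can be sketched in a few lines of Python (the pantry, trip cost and helper names are all hypothetical, purely to illustrate the idea — this is not how Sage or any real server is written):

```python
import time

PANTRY = {"sugar": 1, "eggs": 6, "flour": 2}
TRIP_COST = 0.001  # assumed cost of one trip to the "kitchen", in seconds


def fetch_one(item):
    time.sleep(TRIP_COST)  # every single lookup pays a full trip
    return PANTRY.get(item)


def fetch_many(items):
    time.sleep(TRIP_COST)  # one trip; the "server" does the looking for us
    return {i: PANTRY.get(i) for i in items}


shopping_list = ["sugar", "eggs", "flour"]

flat_file = {i: fetch_one(i) for i in shopping_list}  # three separate trips
client_server = fetch_many(shopping_list)             # one trip, same answer
assert flat_file == client_server
```

Same result either way; the difference is simply how many trips you pay for, and on a network each trip is expensive.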
There’s a little twist to this scenario called caching, which is how computers speed up this backwards-and-forwards process. Using a cache, the first time you go to the kitchen you keep a list of everything you found and take it back to the lounge. Then, instead of going back to the kitchen for every subsequent item, you just look at the list.
This is obviously much quicker, but it runs the risk that in the meantime someone has used one of your eggs (changed the data), in which case you have to go back to the kitchen and rewrite your list. This is roughly what is called opportunistic locking, and it works as long as someone tells you they’ve nicked an egg — which, as in life, isn’t always the case.
In a client/server application, local caching doesn’t really matter, as the server is doing all the work. But caching still has a role to play on the server itself, which only has to look in the kitchen cupboard once rather than checking each item individually before shouting back the answer.
Now, on a single computer the client/server distinction doesn’t really apply, as there is only one of you doing all the work, but it comes into its own on a network.
With a network you no longer have a kitchen cupboard. All your ingredients are in the grocery shop.
So instead of a short trot to the kitchen you have now got to get in your car, drive to the store, and see if they have the sugar in stock.
Then you drive home.
Then you drive back again to see if they have the eggs.
It takes a lot of time.
In a client/server program, you drive to the store, hand over your shopping list, and they (the server) deliver the items to your home (it’s an old-fashioned store). The result is only one round trip to the store, with them doing all the work instead of you, so you can carry on with those important things (drinking more beer) in the meantime.
Now, if the store does caching as well, they will have kept a list of the things you need to buy, and they’ll only have to search all the aisles once (unless, of course, someone else buys something on your list). So even without a client/server application, caching on the server speeds up the time taken to find your groceries and deliver them to you.
Unfortunately it all goes horribly wrong when the Store Detective starts work.
They want to check everybody (including store employees) every time someone walks down an aisle, and possibly when you park the car and walk in the door.
The more mature detectives can be told to use common sense and just check certain people and aisles, but the new brigade (minifilters) are having nothing of this. They check everything on their list even if you’re a valued customer, just to make sure you’re not on the list, so every single item you look for will be examined by the detective. Have a nice day, sir.
And if the new brigade are in a really bad mood, they’ll tear up your list (so you can’t use the cache) each time you visit the aisle.
The bottom line is that Sage and many other non-client/server applications can be quite network intensive for operations that involve searching through large amounts of data.
If the Store Detective (Anti Virus) is too intrusive this can seriously degrade performance.
Keep in mind that if you have 100,000 transactions, some processes may involve 100,000 trips to the store, 100,000 conversations with the detective, and 100,000 tearing-ups of shopping lists (so the cache is never used) — and these costs quickly compound in performance terms.
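Some back-of-the-envelope arithmetic shows how quickly this compounds (the millisecond figures below are illustrative assumptions, not measurements):

```python
# Per-record costs multiplied across 100,000 records.
transactions = 100_000
round_trip_ms = 0.5  # assumed network round trip per record
av_scan_ms = 0.2     # assumed per-access antivirus check per record

total_seconds = transactions * (round_trip_ms + av_scan_ms) / 1000
print(f"about {total_seconds:.0f} seconds for a single search")  # about 70 seconds
```

Even sub-millisecond overheads per record turn a single search into more than a minute of waiting — which is why a heavy-handed on-access scanner hurts so much.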
You can’t avoid the trips to the store until Sage bring out their new version in 2010 based on MySQL, but you can make sure you have a well-behaved Store Detective.
In our experience, the number 1 cause of poor Sage 50 performance is Anti Virus software.
For more information see www.sbslimited.co.uk/sageslow.htm