@Hikari — Member, 1 year, 9 months ago
Hey all. I just discovered Tabbles and installed it and did a few tests.
I’m using the cloud service. If I like the software and decide to buy a licence, I’m going to install it on my two PCs and a laptop. I’m also buying a NAS to expand my storage and make it easier to share my files. While moving files over to the NAS, it would be a great opportunity to tag them.
But then I wonder: should I stay on the cloud, or is it worth the trouble to install MS SQL at home? I assume the zipped-XML export/import will let me export my data from the Cloud and import it into a local MS SQL instance at any time, so I shouldn’t need to worry about restarting the tagging work if I decide to make that move.
My concern now is how many resources an MS SQL instance running only Tabbles will require.
I use MS SQL 2012 Enterprise at work and it’s really heavy. It loves to eat all available RAM and cache as many tables as possible, even when there’s only one user (a dev environment) and not much data stored.
I also know that the required resources depend on the number of concurrent users and the amount of data. So, again: for an MS SQL Express instance running only Tabbles, with roughly 100K files and 100 tags, how much RAM should I expect it to eat?
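For a sense of scale, here’s a quick back-of-envelope estimate of the raw table data involved. The per-row sizes are my own guesses (not figures from Tabbles or its schema), but they suggest the data itself is tiny compared with SQL Server’s default memory appetite:

```python
# Rough estimate of tagging-database volume for ~100K files and ~100 tags.
# Per-row sizes below are assumptions, not Tabbles internals:
#   a file row ~512 bytes, a tag row ~256 bytes, a file-tag link ~50 bytes,
#   and an average of ~5 tags per file.
files = 100_000
tags = 100
avg_tags_per_file = 5

file_bytes = files * 512
tag_bytes = tags * 256
link_bytes = files * avg_tags_per_file * 50

total_mb = (file_bytes + tag_bytes + link_bytes) / 1024**2
print(f"~{total_mb:.0f} MB of raw table data")  # ~73 MB
```

Even with generous assumptions, the whole dataset fits in well under 100 MB, so most of the RAM an instance grabs is engine overhead and cache policy rather than the data itself.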
@Andrea — Forum administrator, 1 year, 6 months ago
Microsoft SQL Express uses a bit over 500 MB of RAM.
If it’s just you, at home, I suggest you stick with the Cloud… otherwise, I suggest you find a Windows-based NAS on which you can run MS SQL Server.
@xkat — Member, 1 year, 5 months ago
This is a follow-up to my post in another thread: doing most tasks (tagging, searching, combining tabbles, etc.) uses ~450 MB (as Andrea notes) on the SQL Express (2014) side with 1–3% CPU spikes, plus another 50–250 MB for Tabbles itself, with CPU utilization spiking to 2–15%.
This is with only the Tabbles DB running, although I also run a parallel MS Access database, linked to the Tabbles DB in SQL Server, which lets me manipulate relationships and add other objects and properties to Tabbles.
Once again, that’s with ~250 tags and maybe 20,000 files.