09-11-2013 01:33 AM
Hitting the 2GB CLR limit on object sizes with large dBs.
Does anyone know whether future builds of the core source will enable <gcAllowVeryLargeObjects> in the app.config, now that the core framework is on .NET 4/4.5? It would certainly help a great deal. Failing that, is there any chance of getting further documentation on the asynchronous <object>List calls?
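For anyone following along, enabling it is just a runtime element in the host application's config file. This is the standard .NET 4.5 setting (not anything Act!-specific), and note it only takes effect on 64-bit processes:

```xml
<configuration>
  <runtime>
    <!-- Allows objects larger than 2 GB on 64-bit processes (.NET 4.5+).
         A single array dimension is still capped at roughly 2^31 elements. -->
    <gcAllowVeryLargeObjects enabled="true" />
  </runtime>
</configuration>
```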
09-11-2013 06:02 AM
09-11-2013 12:22 PM
Only you would have problems because you couldn't manipulate a dataset larger than 2 GIGABYTES in a plugin you wrote for ACT!. You've managed to brighten my day twice today already. The other time was the post about the improved performance you were seeing with ACT! v16. :-)
09-11-2013 02:41 PM
Haha, OK, I'm busted for being lazy and not fragmenting/chaining the data. Still, given that the SDK is structured around returning non-generic collections even though we're in a .NET 4 environment, and we can't make asynchronous calls, it would make sense to allow big data.
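For what it's worth, the non-lazy workaround is to page through the entities in fixed-size batches so no single in-memory collection ever approaches the 2GB object limit. A rough sketch only - FetchHistoryPage and MigrateBatch below are hypothetical stand-ins for whatever paged read/write calls the Act! Framework actually exposes:

```csharp
// Sketch: batch the migration so no single collection
// comes near the CLR's 2 GB per-object size limit.
const int batchSize = 10_000;   // tune to average record size
int offset = 0;
while (true)
{
    // Hypothetical helper: returns at most batchSize records
    // starting at 'offset' from the source dB.
    var batch = FetchHistoryPage(offset, batchSize);
    if (batch.Count == 0)
        break;                  // no more records to migrate

    MigrateBatch(batch);        // hypothetical write to the target dB
    offset += batch.Count;
}
```

The memory high-water mark then stays at one batch rather than the whole entity set, at the cost of more round trips.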
09-11-2013 10:08 PM
I still think it's funny that the 2GB limit is causing problems for you. My eyes are actually watering. :-)
09-12-2013 05:31 AM
10-14-2013 11:10 AM
Put this in the product request forum and reference http://msdn.microsoft.com/en-us/library/hh285054(v=vs.110).aspx
I don't see any harm in allowing this, but I agree with the other posters - a 2GB array? Jeesh, you are crunching some data.
I think this only applies if you are running a plugin. If you reference an instance of the framework (or connect to the data source directly) from your standalone code, then you can cast to the BigArray type (I assume your arrays are numeric), or use other methods to avoid the 2GB limitation.
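One such method that needs no config flag at all is a jagged array: the 2GB limit applies per object, and each inner array of a jagged array is its own object, so the total can exceed 2GB as long as each row stays under it. A minimal sketch (the sizes here are deliberately tiny; in practice you'd size each row to stay below the limit):

```csharp
// Jagged array: double[][] is an array of separate double[] objects,
// so only each row is bound by the 2 GB limit, not the total.
double[][] big = new double[4][];
for (int i = 0; i < big.Length; i++)
    big[i] = new double[1_000];  // in practice, keep each row < 2 GB

// Index as big[row][col], e.g.:
big[2][500] = 3.14;
```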
Any chance we could get an idea how you are using a single, gigantic array?
10-14-2013 04:30 PM
It's actually for an internal utility for migrating data from one dB to another. The Histories and Activities data sets can get very large very quickly when you are dealing with >30 very active users in an organisation, and one of the key requirements is keeping up to 5-7 years' worth of institutional memory.
I grant it's quite eye-watering and doesn't come up all that often, but when it does it can be a real pain, e.g. upgrading to a newer version of Act! from a much older one. The dB design is completely revamped, and redundant fields are deprecated from the new schema. This requires creating a new dB with the new schema structure and design, then migrating the old data en masse. It is purely the secondary entities holding massive amounts of data that cause traditional 3rd-party tools to fall over, hence the need for dedicated internal tools for this kind of task. Ideally this would be a job for something like the bcp utility in SQL Server, but since we don't have enough documentation on the schema for me to feel comfortable and confident using full CRUD operations at the data layer, I am restricted to using the Framework.
I must admit I am surprised that others have not run into these kinds of issues in the past. I really must be doing something wrong, or do I just repeatedly get abnormally data-hungry clients?? *worry*worry*fret*fret*
Thanks for everyone's input BTW.
10-15-2013 05:42 AM - edited 10-15-2013 05:44 AM
Edit: I had mentioned breaking the data up, but then I re-read some of the posts and saw that Vivek knew he was being lazy in that regard. So I've removed my comment.
On a side note, yes, I think you have very data-hungry clients. I'd have thought companies of that size, with 30+ active users, would have moved on to one of the "larger" products (Microsoft CRM comes to mind, though I haven't really researched CRM products).