
Working with v.large Datasets

Bronze Elite Contributor
Posts: 2,115
Country: United_Kingdom

Working with v.large Datasets

I'm hitting the 2GB CLR limit on object sizes when working with large dBs.

Does anyone know whether future builds of the core source will include <gcAllowVeryLargeObjects> in the App.config, now that the core framework is on .NET 4/4.5? It would certainly help a great deal. Failing that, is there any chance of getting some further documentation on asynchronous <object>List calls?
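For reference, the switch in question is a runtime setting in the host application's config file, available from .NET 4.5 on 64-bit processes. A plugin can't turn it on for itself; the hosting exe's App.config would need something like the following (a sketch of the standard .NET element, not confirmed to be in any Act! build):

```xml
<?xml version="1.0" encoding="utf-8"?>
<configuration>
  <runtime>
    <!-- .NET 4.5+, 64-bit only: allow single objects (arrays) larger than 2 GB -->
    <gcAllowVeryLargeObjects enabled="true" />
  </runtime>
</configuration>
```

Even with this enabled, the maximum number of elements in a single-dimension array is still capped (roughly 2^31 elements for most element types), so it lifts the byte-size ceiling rather than removing limits entirely.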

 

 

Vivek Gargav
Caldere Associates Ltd.
www.caldere.com
vgargav@caldere.com
My Blog
Nickel Contributor
Posts: 175
Country: USA

Re: Working with v.large Datasets

Just curious about this: does that limit apply to the 64-bit CLR? I know that 32-bit Windows basically doesn't allow an application to use more than 2GB of RAM, but that limitation isn't there in 64-bit Windows. One would think the .NET Framework would follow the same logic (but it's two different teams at Microsoft, so it's anyone's guess).
Silver Super Contributor
Posts: 2,328
Country: USA

Re: Working with v.large Datasets

Vivek,

Only you would have problems because you couldn't manipulate a dataset larger than 2 GIGABYTES in a plugin you wrote for ACT!. You've managed to brighten my day twice today already. The other time was the post about the improved performance you were seeing with ACT! v16. :-)

 

Stan


If you would like to get more out of ACT! you can find an ACT! Certified Consultant near you by going to: www.act.com/acc.
-------------------------------------------------------------------------------------
Stan Smith
ACT! Certified Consultant
ADS Programming Services, Inc.
(205) 222-1661
www.adsprogramming.com
www.actwebhosting.com
Click Here to Purchase Act!
Bronze Elite Contributor
Posts: 2,115
Country: United_Kingdom

Re: Working with v.large Datasets

knif,
Sadly, the 2GB limit is present on a 64-bit OS as well.
Bronze Elite Contributor
Posts: 2,115
Country: United_Kingdom

Re: Working with v.large Datasets

Stan,

 

Haha, OK, I'm busted for being lazy and not fragmenting/chaining the data. Still, if the SDK is structured around returning non-generic collections even though we're in a .NET 4 environment, and we can't make asynchronous calls, it would make sense to allow big data.
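For what it's worth, the fragmenting/chaining approach can stay fairly painless without touching the SDK at all. A minimal C# sketch of one way to do it: page any sequence into fixed-size batches so that no single collection ever has to hold the whole data set at once.

```csharp
using System.Collections.Generic;

static class Batching
{
    // Lazily splits a sequence into batches of at most 'size' items.
    // Each batch can be processed and discarded before the next is built,
    // keeping every individual object well under the CLR's 2 GB ceiling.
    public static IEnumerable<List<T>> InBatchesOf<T>(this IEnumerable<T> source, int size)
    {
        var batch = new List<T>(size);
        foreach (var item in source)
        {
            batch.Add(item);
            if (batch.Count == size)
            {
                yield return batch;
                batch = new List<T>(size);
            }
        }
        if (batch.Count > 0)
            yield return batch;  // flush the final partial batch
    }
}
```

With something like this, a migration loop writes each batch to the target dB and lets the GC reclaim it, rather than materialising one giant list up front.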

Silver Super Contributor
Posts: 2,328
Country: USA

Re: Working with v.large Datasets

I still think it's funny that the 2GB limit is causing problems for you.  My eyes are actually watering.  :-)

 

Stan


Nickel Contributor
Posts: 175
Country: USA

Re: Working with v.large Datasets

Vivek,
A bit of quick looking around MSDN and I see that you are correct. It does look like the limit on 64-bit is per object within the application, though, not for the application as a whole.

I have to agree with Stan on this: a 2GB database, holy moly!
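To illustrate the per-object (rather than per-process) nature of the limit, a small sketch: on 64-bit .NET without <gcAllowVeryLargeObjects>, a process can use far more than 2GB in total, as long as no single object crosses the line.

```csharp
// 64-bit process, <gcAllowVeryLargeObjects> NOT enabled:
byte[] a = new byte[1500000000]; // ~1.5 GB in one object - fine
byte[] b = new byte[1500000000]; // process total is now ~3 GB - still fine
// byte[] c = new byte[2500000000]; // a single object over 2 GB - OutOfMemoryException
```

So the cap bites per allocation, which is exactly why fragmenting the data sidesteps it.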
Nickel Elite Contributor
Posts: 937
Country: USA

Re: Working with v.large Datasets

Put this in the product request forum and reference http://msdn.microsoft.com/en-us/library/hh285054(v=vs.110).aspx

 

I don't see any harm in allowing this, but I agree with the other posters - a 2GB array - jeesh, you are crunching some data.

 

I think this only applies if you are running a plugin - if you reference an instance of the framework (or connect to the data source directly) from your stand-alone code, then you can cast to the BigArray type (I assume your arrays are numeric), or use other methods of avoiding the 2GB limitation.

 

Any chance we could get an idea how you are using a single, gigantic array?  

Bronze Elite Contributor
Posts: 2,115
Country: United_Kingdom

Re: Working with v.large Datasets

It's actually for an internal utility for migrating data from one dB to another. The Histories and Activities data sets can get very large very quickly when you are dealing with >30 very active users in an organisation, and one of the key requirements is keeping up to 5-7 years' worth of institutional memory.

 

I grant it's quite eye-watering and not all that frequent, but when necessary it can be a real pain, e.g. upgrading to a newer version of Act! from a very much older one. The dB design is completely revamped, and redundant fields are deprecated from the new schema. This requires a new dB to be created with the new schema structure and design, and the old data migrated en masse. It is purely the secondary entities holding massive amounts of data that cause traditional 3rd-party tools to fall over, and so require dedicated internal tools for this kind of task. Ideally this would be a job for something like the bcp utility in SQL Server, but since we don't have enough documentation on the schema for me to feel comfortable and confident using full CRUD operations at the data layer, I am restricted to using the Framework.

 

I must admit I'm surprised that others have not run into these kinds of issues in the past. I really must be doing something wrong, or I just repeatedly have abnormally large, data-hungry clients?? *worry*worry*fret*fret*

 

Thanks for everyone's input BTW.

Nickel Contributor
Posts: 175
Country: USA

Re: Working with v.large Datasets

[ Edited ]

Edit: I had mentioned breaking the data up, but then I re-read some of the posts and saw that Vivek knew he was being lazy in that regard. So I've removed my comment.

On a side note: yes, I think you have very data-hungry clients. I'd think companies of that size, with 30+ active users, would've moved on to one of the "larger" products (Microsoft CRM comes to mind, though I haven't really researched CRM products).