Internal Errors - Locking / Memory Issues?

Topics: Issues 3: Executing the Component (Run-Time), Possible Bugs/Missing Features?
Aug 29, 2011 at 3:31 PM
Edited Aug 29, 2011 at 4:08 PM

I am having some issues that I cannot find a solution for; any help would be greatly appreciated!

Packages: I have a main package that calls 62 child packages. The main package is executed with MaxConcurrentExecutables set to -1, which SSIS resolves to the number of logical processors plus 2; on this server that works out to 18 packages running concurrently. The Dimension Merge SCD component is used in 13 of the packages. The total runtime of these packages is just under 5 minutes, and the job is scheduled to run every 5 minutes, so the packages effectively start again as soon as they finish.

Server: Quad-Core 2.31 GHz / 32 GB RAM

I am randomly receiving the following error messages from packages that use the Dimension Merge SCD component when threading is set to Automatic:

Internal error (Error building work units: Error getting next work unit - marking all keys matched: Internal error in RuntimeInputCache retrieving current key list.) in ProcessCache_Thread_MatchKeys.

and 

Internal error (Internal error (Unexpected exception in OrderedHashList.Remove removing entry from internal structures: Attempted to read or write protected memory. This is often an indication that other memory is corrupt.) in ProcessKey thread dequeueing a key (69d5160001000000).) in ProcessCache_Thread_ProcessKey.

The number of rows coming through on each run typically varies from 0 to 50,000.

When I changed the job to run only a few times a day, I did not see any of these errors.
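To pin down whether the failures line up with back-to-back runs or with row volume, I have started logging each cycle to a small audit table. All of the names below are placeholders I made up:

    -- Placeholder audit table to correlate failures with run timing and volume.
    CREATE TABLE dbo.LoadAudit
    (
        AuditId    int IDENTITY(1,1) PRIMARY KEY,
        StartedAt  datetime NOT NULL DEFAULT GETDATE(),
        FinishedAt datetime NULL,
        RowsIn     int NULL,
        Failed     bit NOT NULL DEFAULT 0
    );

    -- Gap in seconds between each run and the previous one, alongside failures:
    SELECT cur.AuditId,
           DATEDIFF(second, prev.FinishedAt, cur.StartedAt) AS GapSeconds,
           cur.RowsIn,
           cur.Failed
    FROM dbo.LoadAudit AS cur
    LEFT JOIN dbo.LoadAudit AS prev ON prev.AuditId = cur.AuditId - 1;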

----

I decided to disable threading in the Dimension Merge SCD components, then put the packages back on their every-5-minutes schedule, and now I am getting a new error:

Internal error (Attempted to read or write protected memory. This is often an indication that other memory is corrupt.) in ProcessKey thread dequeueing a key (NullKeyStruct).

 

Any insight into this matter would be greatly appreciated!

EDIT: I am only performing SCD type 1. 

EDIT 2: I just noticed that the component loads the entire existing dimension table even if no records are coming through the data flow. The existing dimension tables have 200,000 to 5,000,000 rows. Should this be occurring?
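If loading the whole existing table is by design, one workaround I may try is narrowing the existing-dimension source query to just the business keys present in the incoming batch. The table and column names below are placeholders for my actual staging and dimension tables:

    -- Sketch: feed the component only the dimension rows whose business keys
    -- appear in the current batch, instead of the whole 200,000-5,000,000 rows.
    -- dbo.StageCustomer, dbo.DimCustomer and CustomerBusinessKey are made-up names.
    SELECT d.*
    FROM dbo.DimCustomer AS d
    WHERE EXISTS
    (
        SELECT 1
        FROM dbo.StageCustomer AS s
        WHERE s.CustomerBusinessKey = d.CustomerBusinessKey
    );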