Showing posts with label tasks.

Wednesday, March 28, 2012

Multi-thread tasks using a single connection object concurrently

is it possible to use a single connection object shared by several
tasks, where each task is handled by a thread? these tasks call stored
procedures that return record sets; no inserts, updates or deletes.

my reasoning is that a db connection is expensive and each user can
only have 1 connection object. each user submits a request to the web
server, and constructing the result web page requires a number of
result sets obtained from several stored procedures on the same
database instance.

i could obtain my result sets by making each sp call one at a time,
but was thinking whether i could instead break up the tasks and create
a thread for each task, with all threads sharing the same db connection
object concurrently. the connection object would stay alive until all
tasks are completed, the point being that the final objective could be
achieved in a shorter time. no connection pooling here.

i am wondering over questions like:

1. within the same db instance, if 2 stored procedures need to access
the same table, does SQL Server queue up the requests and service one
at a time?

2. would there be any problem using the connection object this way,
sharing it between multiple threads?

3. is it possible that a 'dead-lock' may occur within the DB?

of course this whole idea is absurd and goes straight into the trash
if the connection object doesn't support multiple threads and is
queue-based in the first place.

pardon me if my SQL server basics are off track. i've been doing too
many other things.

thanks,
mengmeng (hui_km@.star-quest.com) writes:
> i could obtain my result sets by making each sp call one at a time,
> but was thinking whether i could instead break up the tasks and create
> a thread for each task, with all threads sharing the same db connection
> object concurrently. the connection object would stay alive until all
> tasks are completed, the point being that the final objective could be
> achieved in a shorter time. no connection pooling here.

I honestly don't know if you can share connection objects between threads,
but in any case it seems like a pointless thing, because the connection
object would be a common resource. You cannot submit two queries on the
same connection object at the same time. ...ah, wait, actually with ADO
you can, but what happens is that ADO opens a new connection behind your
back.

> 1. within the same db instance, if 2 stored procedures need to access
> the same table, does SQL Server queue up the requests and service one
> at a time?

That depends on how the procedures access the tables. For read operations,
the procedures may well execute in parallel. But if one process starts
to update, the other process will be blocked. (Unless it uses the
READ UNCOMMITTED isolation level.)

> 2. would there be any problem using the connection object this way,
> sharing it between multiple threads?

As I said, it would be a pointless thing to do.

> 3. is it possible that a 'dead-lock' may occur within the DB?

Yes. That is not to say that you will experience dead-locks, only that
it could happen to you.

--
Erland Sommarskog, SQL Server MVP, sommar@.algonet.se

Books Online for SQL Server SP3 at
http://www.microsoft.com/sql/techin.../2000/books.asp

|||

hi erland,

i was thinking of the connection object as just a handle to a database
connection, but if it has its own state, and if it's not a thread-safe
object, then it would not work at all. even if i could wrap the connection
object in thread-safe code, if it cannot handle a thread context switch,
the idea is useless. the part about ado opening a connection of its own
accord is interesting though. i didn't know that at all.

this piece of info is helping me decide how i should implement and write
my app server components. thanks for your response.

regards,
meng


Monday, March 19, 2012

Multiple transactions not working in package

I have a package with two sequence containers, each containing two SQL tasks and a data flow task, executed in that order. I want to encapsulate the data flow task in a transaction but not the SQL tasks. I have the TransactionOption property set to 'required' on the data flow tasks and 'supported' on the SQL tasks and the sequence containers. When I run the package I get a distributed transaction error on the first SQL task of the second sequence container:

"[Execute SQL Task] Error: Executing the query "TRUNCATE TABLE DistTransTbl2" failed with the following error: "Distributed transaction completed. Either enlist this session in a new transaction or the NULL transaction.". Possible failure reasons: Problems with the query, "ResultSet" property not set correctly, parameters not set correctly, or connection not established correctly."

The only way I can get the package to succeed is to set the TransactionOption = 'required' on the sequence containers and 'supported' on all subordinate tasks. This is not what I want, however. Any ideas?

Thanks,

Eric

Eric,

Can you please send me your package?

So far I couldn’t repro the problem.

|||

Eric, have you tried using a DELETE FROM statement instead of a truncate statement? TRUNCATE is minimally logged, so it's very fast, but it is probably the reason you're blocking on your second task that's attempting to insert into the table.

K

|||

Hi

I am getting the EXACT same problem, but I am not using any TRUNCATE statements or similar.

Actually, I find that the problem only seems to exist when I try to enlist a Data Flow task in a transaction.

If I have 2 sequence containers, each containing an Execute SQL Task, everything works fine. If I try to replace one of these with a Data Flow Task, then I get the same error.

Does anyone have a solution to this?

|||I could not reproduce the issues reported.


Monday, March 12, 2012

Multiple tasks in the same assembly

Hello,
All the SSIS tasks (and components, come to mention it) exist in their own assembly files.

Is there anything that stops us from putting multiple tasks into one assembly?
Assuming you can do that, is there a reason why this perhaps should not be done?

Thanks
Jamie|||Actually, the percent sampling and row sampling transforms are in the same DLL.

There is nothing to stop you from having multiple tasks or multiple components in the same DLL. I can't even think of a reason why you couldn't have both a task and a component in the same DLL. In general, the reason why you wouldn't do that is ease of fix and/or upgrade. If you put 10 transforms in one DLL then if you only changed one you still have to "update" all the others because the DLL has changed. If they are all in separate DLLs then you only update the one that changed (i.e. instead of having to update a 5MB DLL you only have to update a 20KB DLL).

HTH,
Matt|||

Matt David wrote:

Actually, the percent sampling and row sampling transforms are in the same DLL.

There is nothing to stop you from having multiple tasks or multiple components in the same DLL. I can't even think of a reason why you couldn't have both a task and a component in the same DLL. In general, the reason why you wouldn't do that is ease of fix and/or upgrade. If you put 10 transforms in one DLL then if you only changed one you still have to "update" all the others because the DLL has changed. If they are all in separate DLLs then you only update the one that changed (i.e. instead of having to update a 5MB DLL you only have to update a 20KB DLL).

HTH,
Matt

Thanks Matt,

I thought it was a requirement that the DLLs for Tasks went in the ..\Tasks folder and the DLLs for components went in the ..\PipelineComponents folder. How could you do that if they were in the same DLL?

-Jamie|||

Jamie Thomson wrote:

Thanks Matt,

I thought it was a requirement that the DLLs for Tasks went in the ..\Tasks folder and the DLLs for components went in the ..\PipelineComponents folder. How could you do that if they were in the same DLL?

-Jamie


Yes, the same DLL would need to go to both of these places (plus Global Assembly Cache). Still, you build just one DLL.

Michael.|||I was working through this and have decided to go for the multiple-assembly approach, because of the ease of deployment and installation.|||
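
(for illustration, the dual-copy deployment described above can be sketched as follows. this is a sketch only: `DTS_DIR` and `MyTaskAndComponent.dll` are hypothetical stand-ins for the real SQL Server DTS install folder and the built assembly:)

```shell
DTS_DIR="./dts_demo"   # stand-in for the SQL Server DTS install folder
mkdir -p "$DTS_DIR/Tasks" "$DTS_DIR/PipelineComponents"

touch MyTaskAndComponent.dll   # stand-in for the single built assembly

# The same DLL is copied to both folders...
cp MyTaskAndComponent.dll "$DTS_DIR/Tasks/"
cp MyTaskAndComponent.dll "$DTS_DIR/PipelineComponents/"

# ...and also registered in the Global Assembly Cache (Windows only):
# gacutil /i MyTaskAndComponent.dll

ls "$DTS_DIR/Tasks" "$DTS_DIR/PipelineComponents"
```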

You may want to consider going one step further and separating the Task/Component and its User Interface into 2 distinct assemblies as well. This allows you to modify one without modifying the other, although the deployment benefit is a bit less obvious in this case.

In the main Task/Component class, the UITypeName property in the Attribute links your main assembly to the UI assembly.

-Doug

|||

One benefit of separating the UI from the task is that you don't have UI code on the server, for security reasons (easier to lock down).
The reason we did this in the beginning was to make the task DLLs smaller, so that if we ever shipped something like MSDE (which has no UI), we could ship the tasks without the UI to make the distribution smaller (for web releases etc.).
This might be useful for those writing custom tasks as well, though I can't think of a case where you might wish to ship a task without the UI.
K