Thursday, March 8, 2012

CLR Integration with Complex Types

I am just starting to look into CLR integration with SQL Server 2005.
What I would like to do is:
Pass a string array or object array parameter into a CLR stored procedure
Iterate through the elements and for each element update some tables or
call a TSQL stored procedure
I have compiled the .NET class and created the assembly in SQL server.
However when I try to run CREATE PROCEDURE to create the stored procedure
using EXTERNAL NAME I run into problems with not being able to match the .NET
parameter types in TSQL. Is it possible to do this or is there another
suggested method for handling the above?
Thanks,
Alistair|||Hello Alistair,
Since T-SQL is the gateway into SQLCLR, your data types must be ones that
SQL Server understands. It doesn't understand arrays, strings or otherwise.
There is a namespace, System.Data.SqlTypes, that provides .NET types with
T-SQL's semantics that you should be using as well.
So, long story short, you shouldn't pass data in like that. Ideally, you
should be able to write this kind of logic using pure T-SQL (which is better
since it's likely to be a lot more performant).
One way to work around this is to pass the data you want to munge as XML, but
I doubt that's going to be any easier or faster.
Is there something in particular you're trying to do?
Thank you,
Kent Tegels
DevelopMentor
http://staff.develop.com/ktegels/|||Something like this maybe:
using System;
using System.Data;
using System.Data.SqlClient;
using System.Data.SqlTypes;
using Microsoft.SqlServer.Server;

public partial class UserDefinedFunctions
{
    [Microsoft.SqlServer.Server.SqlFunction]
    public static int Test1(string rec)
    {
        if (rec == null)
            return 0;

        // Split the comma-delimited input into individual values.
        string[] sa = rec.Split(new char[] { ',' },
            StringSplitOptions.RemoveEmptyEntries);
        foreach (string s in sa)
        {
            // Call sproc passing s.
        }
        return sa.Length; // Return number of elements processed.
    }
}
William Stacey [MVP]
"Alistair Harrison" <AlistairHarrison@.discussions.microsoft.com> wrote in
message news:60144A14-099C-4270-9530-B92058491E48@.microsoft.com...
>I am just starting to look into CLR integration with SQL Server 2005.
> What I would like to do is:
> Pass a string array or object array parameter into a CLR stored procedure
> Iterate through the elements and for each element update some tables or
> call a TSQL stored procedure
> I have compiled the .NET class and created the assembly in SQL server.
> However when I try to run CREATE PROCEDURE to create the stored procedure
> using EXTERNAL NAME I run into problems with not being able to match the
> .NET
> parameter types in TSQL. Is it possible to do this or is there another
> suggested method for handling the above?
> Thanks,
> Alistair|||Hi,
Thanks for your reply.
I am really still thinking about how all the pieces of the system are going
to fit together.
The situation is that I have several different systems that are all going to
be updating the database. Each is potentially going to want to batch insert
up to four new records in one table and another single record in a second
table based on the four inserts into the first table.
What I had envisaged was passing an array of objects with relevant
properties to the CLR SP (possibly exposed as a native web service) which
could iterate over the array and call the relevant TSQL insert SPs. The work
could then all be done local to the database within a transaction and reduce
the amount of calls being made over the network. This is clearly not possible
with the TSQL typing constraints.
The other option seems to be to make several calls over the network from the
client to the relevant SPs and manage the transaction on the client?
I have used XML with SQL Server 2000 in the past to effectively pass an
array of objects and then accessed the data using OPENXML. This seemed to
have a bit of an overhead when opening the XML?
Thanks,
Alistair
"Kent Tegels" wrote:

> Hello Alistair,
> Since T-SQL is the gateway into SQLCLR, your data types must be ones that
> SQL Server understands. It doesn't understand arrays, strings or otherwise.
> There is a namespace, System.Data.SqlTypes, that provides .NET types with
> T-SQL's semantics that you should be using as well.
> So, long story short, you shouldn't pass data in like that. Ideally, you
> should be able to write this kind of logic using pure T-SQL (which is better
> since it's likely to be a lot more performant).
> One way to work around this is to pass the data you want to munge as XML, but
> I doubt that's going to be any easier or faster.
> Is there something in particular you're trying to do?
> Thank you,
> Kent Tegels
> DevelopMentor
> http://staff.develop.com/ktegels/
>
>|||Hello William Stacey,
True, but if you're simply producing a vector of values to work with, a TVF
might be a more efficient choice since it would give T-SQL a table to work with.
There's an example of that at:
http://www.sqljunkies.com/WebLog/kt...1/17/17416.aspx
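For reference, a minimal sketch of such a streaming TVF might look like this (the names SplitToTable and FillRow, and the comma delimiter, are illustrative, not taken from the linked post):

```csharp
using System;
using System.Collections;
using System.Data.SqlTypes;
using Microsoft.SqlServer.Server;

public partial class UserDefinedFunctions
{
    // Streams each delimited value back to T-SQL as a one-column table.
    [SqlFunction(FillRowMethodName = "FillRow",
        TableDefinition = "value nvarchar(4000)")]
    public static IEnumerable SplitToTable(SqlString list)
    {
        if (list.IsNull)
            return new string[0];
        return list.Value.Split(new char[] { ',' },
            StringSplitOptions.RemoveEmptyEntries);
    }

    // Called once per element to populate the output row.
    public static void FillRow(object row, out SqlString value)
    {
        value = new SqlString((string)row);
    }
}
```

From T-SQL you could then join against it, e.g. SELECT value FROM dbo.SplitToTable(N'a,b,c').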
Thank you,
Kent Tegels
DevelopMentor
http://staff.develop.com/ktegels/|||Hello Alistair,

> What I had envisaged was passing an array of objects with relevant
> properties to the CLR SP (possibly exposed as a native web service)
> which could iterate over the array and call the relevant TSQL insert
> SPs. The work could then all be done local to the database within a
> transaction and reduce the amount of calls being made over the
> network. This is clearly not possible with the TSQL typing constraints.
Well, you could packetize the records into an XML document and pass that
as a string, then use something to rehydrate that.
Note that the HTTP ENDPOINT stuff in SQL 2005 also allows for the submission
of batches which might make this somewhat easier.
Is your primary goal to make the least calls, or to send the least amount
of data over the wire, or to write the least code possible, or to have the
easiest code to maintain, or what?

> The other option seems to be to make several calls over the network
> from the client to the relevant SPs and manage the transaction on the
> client?
Yep, and use the TransactionScope class to do that. Have two procs, one for
each table's insert. You should be able to use MARS to do that on one connection
and use TransactionScope to manage it.
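A rough client-side sketch of that pattern (the proc names dbo.InsertDetail and dbo.InsertSummary, the parameter, and the connection string are placeholders, not from this thread):

```csharp
using System;
using System.Data;
using System.Data.SqlClient;
using System.Transactions;

class BatchInserter
{
    // Wraps both stored-procedure calls in a single ambient transaction;
    // if anything throws before scope.Complete(), everything rolls back.
    public static void InsertBatch(string connStr, string[] detailValues)
    {
        using (TransactionScope scope = new TransactionScope())
        using (SqlConnection conn = new SqlConnection(connStr))
        {
            conn.Open();
            foreach (string v in detailValues)
            {
                using (SqlCommand cmd = new SqlCommand("dbo.InsertDetail", conn))
                {
                    cmd.CommandType = CommandType.StoredProcedure;
                    cmd.Parameters.AddWithValue("@value", v);
                    cmd.ExecuteNonQuery();
                }
            }
            using (SqlCommand cmd = new SqlCommand("dbo.InsertSummary", conn))
            {
                cmd.CommandType = CommandType.StoredProcedure;
                cmd.ExecuteNonQuery();
            }
            scope.Complete(); // Commit the transaction.
        }
    }
}
```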

> I have used XML with SQL Server 2000 in the past to effectively pass
> an array of objects and then accessed the data using OPENXML. This
> seemed to have a bit of an overhead when opening the XML?
Here you wouldn't. You'd write a single SQLCLR sproc that took in the XML
as a parameter (type string or SqlXml, shouldn't matter much). You'd then
use an XmlReader to parse out the values and do the inserts. I could be talked
into posting an example if needed.
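Along those lines, a minimal sketch of such a sproc could look like this (the element name "rec", its "value" attribute, and the target proc dbo.InsertDetail are assumptions for illustration):

```csharp
using System.Data;
using System.Data.SqlClient;
using System.Data.SqlTypes;
using System.Xml;
using Microsoft.SqlServer.Server;

public partial class StoredProcedures
{
    // Walks the incoming XML with an XmlReader and inserts one row
    // per <rec value="..."/> element via the context connection.
    [SqlProcedure]
    public static void InsertFromXml(SqlXml batch)
    {
        if (batch.IsNull)
            return;
        using (SqlConnection conn =
            new SqlConnection("context connection=true"))
        {
            conn.Open();
            using (XmlReader reader = batch.CreateReader())
            {
                while (reader.Read())
                {
                    if (reader.NodeType == XmlNodeType.Element &&
                        reader.Name == "rec")
                    {
                        using (SqlCommand cmd =
                            new SqlCommand("dbo.InsertDetail", conn))
                        {
                            cmd.CommandType = CommandType.StoredProcedure;
                            cmd.Parameters.AddWithValue("@value",
                                reader.GetAttribute("value"));
                            cmd.ExecuteNonQuery();
                        }
                    }
                }
            }
        }
    }
}
```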
Service Broker may or may not be useful here depending on your actual goals.
I've never been a big fan of sending object graphs when messages would do
but that's just me. :)
Thank you,
Kent Tegels
DevelopMentor
http://staff.develop.com/ktegels/|||Hello Kent,
Thanks for your response.

> Is your primary goal to make the least calls or to send the least amount
> of data over the wire or to write the least code possible or to have the
> easiest code to maintain or what?
>
The devices running the client application are going to be numerous and
quite varied, therefore I had hoped to keep the client application quite
lightweight. This is the reason I was looking at making a single call with an
array of objects and managing the transaction on the server. The server would
then return whether the updates were successful or not and the client can
simply respond to this rather than managing the logic and transaction as well.
This approach should also help maintenance as, if the DB needs to be changed,
the data and logic can all be changed on the server rather than having to
release to the numerous client devices.
Is it more efficient to structure things like this?
It seems that from what you have said that the best way to achieve this
would be to pass the data from the client to the server in a single call
using XML. It would be interesting and also helpful to see your example of
extracting the data from the SqlXml type in a CLR SP if you wouldn't mind
posting it?
I had also been thinking that I could use a .NET webservice to allow the
client to pass an array of objects to the server and then the webservice
could interact with the database? This would also allow the call from the
clients to server to be asynchronous.
Thanks again for your advice,
Alistair|||Hello Alistair,

> The devices running the client application are going to be numerous
> and quite varied, therefore I had hoped to keep the client application
> quite lightweight. This is the reason I was looking at making a single
> call with an array of objects and managing the transaction on the
> server. The server would then return whether the updates were
> successful or not and the client can simply respond to this rather
> than managing the logic and transaction as well.
Okay, that can make sense, just understand that you're ultimately trading
scalability for device reach. At worst it means you need a bigger box(es)
to host the SQL processes on.

> This approach should also help maintenance as if the DB needs to be
> changed the data and logic can all be changed on the server rather
> than having to release to the numerous client devices.
Ideally, until you have to change an API anyway.

> Is it more efficient to structure things like this?
It could be; again, it really depends on what you're trying to be efficient
at. It's the path I'd go down so long as I knew I was dealing with more than,
say, 5,000 concurrent devices hammering the box at once. Generating the
client-side XML may or may not be terribly efficient either. I'd probably
schema-bind the XML too -- just to be safe.
Note that XQuery isn't terribly fast and you might get overall better perf
out of using SQLCLR to do the parsing. I'm working on that example, but there's
some stuff with Integration Services I need to get out of the way first.

> It seems that from what you have said that the best way to achieve
> this would be to pass the data from the client to the server in a
> single call using XML. It would be interesting and also helpful to see
> your example of extracting the data from the SqlXml type in a CLR SP
> if you wouldn't mind posting it?
http://www.sqljunkies.com/WebLog/kt...er.aspx


> I had also been thinking that I could use a .NET webservice to allow
> the client to pass an array of objects to the server and then the
> webservice could interact with the database? This would also allow the
> call from the clients to server to be asynchronous.
True, but then how would your client really know if the update was successful
or not? Sure, they could "fire and forget," but is that the pattern you were
really going for?
Another tactic to consider would be to use a combination of HTTP Endpoints
(think ASMX hosted by SQL Server) that takes the object graph as a parameter
and then passes it into a Service Broker (SSB) application for processing.
Lots of ways to skin this cat. I don't suspect the cat will like any of them.
:X
Thank you,
Kent Tegels
DevelopMentor
http://staff.develop.com/ktegels/|||Hi Kent,
Thanks for the insight regarding CLR integration with SQLServer.
I have a requirement where the external class has a method that returns a
string. I cannot replicate the object in Transact-SQL because of the
algorithms in the object. Is there any way to pass the method's string value
to SQL Server? I have the assembly loaded and a scalar function to return the
value; however, I am getting the incompatible object type error when I try to
save the altered function.
Any suggestion would be greatly appreciated.
Thanks,
glenn|||Hello Kent,
Thanks again for your response. They are giving me plenty to think about...
I have been experimenting with the devices and database server and have come
to the conclusion that it would make life easier to not have the data access
DLLs on the client devices. Basically the files are installed to a
non-durable location, so when the devices are reset the files are lost, which
causes problems on startup.
This seems to leave two options:
- use a SOAP HTTP endpoint in SQL Server which could take XML as an input;
this could be a CLR sproc which would allow easy iteration through the
elements making relevant inserts or TSQL sproc calls
- use a .NET web service which can take an array of objects and make several
calls to TSQL sprocs in the database within a single transaction
The main differences between the two options seem to be that using a .NET
web service would eliminate the need to generate an XML representation of the
array of objects on the client as this will automatically be serialized by
SOAP. This option requires using IIS whereas the HTTP endpoint option only
requires SQL Server. The transaction across multiple sproc calls/inserts will
also have to be managed in .NET by the web service. Is there any disadvantage
to this?
Something else I have found is that there is little or no support for XML
generation or serializing things in general in the .NET Compact Framework
1.1, so I would have to write custom code to generate the XML.
In terms of asynchronous calls I would definitely need a response to say
whether the call was successful or not. I had been thinking that the web
service could call back to the client to save it waiting for a response.
Cheers,
Alistair Harrison

>
> True, but then how would your client really know if the update was successful
> or not? Sure, they could "fire and forget," but is that the pattern you were
> really going for?
> Another tactic to consider would be to use a combination of HTTP Endpoints
> (think ASMX hosted by SQL Server) that takes the object graph as a parameter
> and then passes it into a Service Broker (SSB) application for processing.
>
