SQL may not be returning any data when you exceed 8000 characters. I suspect that your problem is that the size of the XML data you are receiving exceeds 8000 characters.

I use sp_OACreate extensively to do the sort of thing you are doing. Using it to feed the XML is faster than doing a pure loop, but the size of the XML can be quite variable (especially given that a binary would pass it to the stored proc using L2S), and I've already overflowed the input to the SP a couple of times because there's been too much data, hence why I'm looking to get the SP to retrieve the data itself.

Browstone - SSIS seems to be the way to go; that's what other suggestions have been too. I do admit, however, that I don't know an awful lot about using SSIS other than the Import/Export wizard (I'm more of a dev than a DBA :-) ), so any pointers on how I'd achieve this using SSIS would be helpful. As for using CLR procs in the DB, I had thought of trying that, but I've been bitten badly using those things before (and crashed a couple of servers too!), so I'm a little wary of using them.

I suspect it's something to do with my feed server's config, so I need to do some work to resolve this; I thought I'd add the code here. Thanks for the suggestion, but that's how I currently do the task: I have a CLR binary that I wrote, which runs once a day to sync the feed to the DB, but it takes too long to sync the data. The code above does work with correctly formatted XML from the internet; for example, 'geonames' (which is what the original source of the above code was based on) works fine.
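On the 8000-character point: one known workaround is to capture responseText with INSERT ... EXEC against sp_OAGetProperty, which is not subject to the varchar(8000) ceiling of an output parameter. The sketch below assumes the same MSXML2.XMLHTTP object used elsewhere in this thread; the table and variable names are illustrative, not from the original posts.

```sql
-- Illustrative sketch: capture a response longer than 8000 characters.
-- @obj, @url and the table variable are hypothetical names.
declare @obj int
declare @url varchar(500)

select @url = ''  -- feed URL (omitted in the original post)

exec sp_OACreate 'MSXML2.XMLHTTP', @obj OUT
exec sp_OAMethod @obj, 'open', NULL, 'GET', @url, 'false'
exec sp_OAMethod @obj, 'send'

declare @resp table (xmlText varchar(max))

-- INSERT ... EXEC captures the full property value into a
-- varchar(max) column, avoiding the 8000-character truncation
-- that an OUTPUT parameter would impose.
insert into @resp (xmlText)
exec sp_OAGetProperty @obj, 'responseText'

exec sp_OADestroy @obj
```

Note this still requires 'Ole Automation Procedures' to be enabled via sp_configure, with the same security caveats as the rest of the sp_OA* approach.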
I have a set of XML-based services running on my network. The data in these services needs to be mirrored into a table on a SQL Server 2008 instance. Getting the XML into a table is no problem, I can already do that; what I'm having to do at the moment is pass a huge long string of XML to a stored proc, which then uses the OPENXML command to insert it into the table.

What I'm wanting to know is: instead of using a third-party program to get this data from the service and then call the stored proc to insert it, is it in any way possible to get SQL Server to just grab the XML directly from the service URL and process it like that? No matter how much I try, all I can seem to find are plenty of articles and posts on reading/writing files to/from the server file system, and the several million or so on actually inserting the XML data, but I can't seem to find anything on getting the server to grab data directly from a URL.

After a little bit more research I've found a third way to do this:

    declare @obj as int
    declare @response as varchar(8000)
    declare @doc as int
    declare @url as varchar(500)

    select @url = ''  -- feed URL (omitted in the original post)

    exec sp_OACreate 'MSXML2.XMLHTTP', @obj OUT
    exec sp_OAMethod @obj, 'open', NULL, 'get', @url, 'false'
    exec sp_OAMethod @obj, 'send'
    exec sp_OAMethod @obj, 'responseText', @response OUTPUT
    exec sp_OADestroy @obj
    exec sp_xml_preparedocument @doc OUTPUT, @response

This seems to work, and work quite efficiently; however, here's the weird thing. If I use the full URL, which returns pure XML data with a MIME type of 'text/xml', then the response text contains nothing, it's null; but if I strip the URL back so that the web server is just returning a 404 page or a default HTML page, then I get the actual page content. At first I thought it might be the MIME type, 'text/xml' vs 'text/html', but testing that made no difference: anything that returns valid XML seems to give null, but anything that returns broken XML seems to work!
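For reference, the OPENXML step mentioned above typically looks like the sketch below. The element, attribute, and column names here are made up for illustration; the real feed schema isn't shown in the thread.

```sql
declare @doc int
declare @xml varchar(max)

-- Hypothetical sample payload; the real feed's schema is not shown here.
set @xml = '<items><item id="1" name="alpha"/><item id="2" name="beta"/></items>'

-- Parse the XML text into an internal DOM and get a document handle.
exec sp_xml_preparedocument @doc OUTPUT, @xml

-- Shred the document into rows; flag 1 = attribute-centric mapping.
declare @t table (id int, name varchar(50))
insert into @t (id, name)
select id, name
from openxml(@doc, '/items/item', 1)
with (id int '@id', name varchar(50) '@name')

-- Always release the handle, or the parsed document stays in memory.
exec sp_xml_removedocument @doc
```

On SQL Server 2008 the xml data type's nodes()/value() methods are the more modern alternative to OPENXML and avoid the prepare/remove handle bookkeeping.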