Horkay Blog
The postings on this site are my own and do not represent my Employer's positions, advice or strategies.
Thursday, 25 February 2016

Recently had some SQL Agent jobs that run SSIS packages start failing with an error of "Failing Pending Execution".

Could find no real issues at first.

Finally found some relevant links (thanks Tom!) pointing to a Group Policy change that removed the following user rights from the account running the packages, which caused this:

  • Impersonate a client after authentication
  • Replace a process level token
  • Adjust memory quotas for a process

  • http://stackoverflow.com/questions/23643967/pending-execution
  • http://sirsql.net/content/2015/03/10/ssis-job-steps-fail-pending-execution/
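
Assuming the packages run from the SSIS catalog (SSISDB), a quick way to spot executions stuck in that state is a query along these lines (in catalog.executions, status 5 means "pending"); this is a sketch, not part of the original fix:

SELECT execution_id, folder_name, project_name, package_name, start_time
FROM SSISDB.catalog.executions
WHERE status = 5   -- 5 = pending execution
ORDER BY execution_id DESC;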

    Thursday, 25 February 2016 09:22:28 (Central Standard Time, UTC-06:00) | Comments [0] | SQL Server |  SQL Agent |  SSIS | Web_Blog#
    Sunday, 09 June 2013

Backups are still the most basic and important task performed by a DBA.

    Recent case in point, received this from a SAN Administrator:

    ------

    I just spoke with {vendor named removed}, we have had a triple hardware failure on the array. {Vendor} is in the process of replacing 3 drives at this time.

     

    Apparently we had one drive fail, and while that drive was transitioning to a hot spare, a second drive in the same RAID group failed. The array wasn’t able to handle the double simultaneous failure. {Vendor} attempted to follow procedures when replacing the drives to restore the data, but they were unable to, and all the affected data was lost.

     

    Currently {Vendor} is following a process that will eventually make the LUNs available to the servers as they were before, but these LUNs will be empty, all data on them has been lost.

     

    We’ll work with {Vendor} to get a root cause analysis completed, but all data that was on the affected storage will have to be manually replaced.

    -----------------

     

    Always make sure you have your backups.

     

    Sunday, 09 June 2013 06:29:06 (Central Standard Time, UTC-06:00) | Comments [0] |  SSIS#
    Tuesday, 18 December 2012

Around the farm I always used to carry a knife; a long time ago I switched to a Leatherman multi-tool, it just has more utility.  SSIS provides that same utility.

Recently I ran into a need to download an XML file that had URLs in it to other files that needed to be downloaded via HTTP and imported into a database.  What a pain in the ass.  XML is not my favorite, and downloading via HTTP isn't either; my preference would have been just to do an FTP get with a wild card, but architects are not DBAs, and why would you consult with someone technical for the requirements?

Anyway, SSIS made this job very simple: use XML and loops to download and import the data.  It became very simple and SSIS impressed me.

What made this example even more fun was that the XML was malformed, and I had to use regex to remove the malformed lines!  That is a whole other topic, how to remove comments from XML.

The key to doing this is embedding a script task in a loop to do the download; the rest is basic SSIS import stuff.  Here is what I used:

// Script Task code (C#); this is the Main() method of the task.
// Requires "using System.Net;" and "using System.Text.RegularExpressions;" at the top of the script.
public void Main()
{
    string xmlDocFile = "";

    try
    {
        // Logging start of download
        bool fireAgain = true;
        Dts.Events.FireInformation(0, "Download File", "Start downloading ", string.Empty, 0, ref fireAgain);

        // Create a WebClient to download the file
        WebClient mySSISWebClient = new WebClient();

        // Build a dynamic download URL for {website removed to protect the innocent}
        // and save the file locally (replacing any existing file)
        mySSISWebClient.DownloadFile("http://somewebsite.com/SubDir/" +
            Dts.Variables["YearMonthDay"].Value.ToString() + "/xml/siteData.xml",
            "C:\\server\\Download\\" +
            Dts.Variables["YearMonthDay"].Value.ToString() + "_siteData.xml");

        // Set the user variable, used in the data import
        Dts.Variables["xmlimportfile"].Value = "C:\\server\\Download\\" +
            Dts.Variables["YearMonthDay"].Value.ToString() + "_siteData.xml";

        xmlDocFile = "C:\\server\\Download\\" +
            Dts.Variables["YearMonthDay"].Value.ToString() + "_siteData.xml";

        // Logging end of download
        Dts.Events.FireInformation(0, "Download File", "Finished downloading", string.Empty, 0, ref fireAgain);

        // Strip the malformed XML comments with a regex
        Dts.Events.FireInformation(0, "Download File", "Starting remove comments.", string.Empty, 0, ref fireAgain);
        System.IO.StreamReader file = new System.IO.StreamReader(xmlDocFile);
        string validXml = Regex.Replace(file.ReadToEnd(), "<!--.*?-->", "");
        file.Close();

        // Write the cleaned XML back over the downloaded file
        Dts.Events.FireInformation(0, "Download File", "writing file", string.Empty, 0, ref fireAgain);
        System.IO.StreamWriter writefile = new System.IO.StreamWriter(xmlDocFile, false);
        writefile.Write(validXml);
        writefile.Close();
        Dts.Events.FireInformation(0, "Download File", "Comments removed.", string.Empty, 0, ref fireAgain);

        // Quit the Script Task successfully
        Dts.TaskResult = (int)ScriptResults.Success;
    }
    catch (Exception ex)
    {
        // Log why the download failed
        Dts.Events.FireError(0, "Download File", "Download failed: " + ex.Message, string.Empty, 0);

        // Quit the Script Task unsuccessfully
        Dts.TaskResult = (int)ScriptResults.Failure;
    }
}

    Tuesday, 18 December 2012 10:23:44 (Central Standard Time, UTC-06:00) | Comments [0] | SQL Server |  SSIS#
    Wednesday, 04 November 2009

Don't expect your query results to come back in order without an ORDER BY clause; guess what, the same goes for SSIS!

Today I had to troubleshoot a package (not developed by me or our SQL DBA group), but regardless, it becomes our problem!

SSIS was exporting a table to a flat file; it turns out this flat file is really an XML file.  The table is populated with XML from a stored procedure prior to the export.  The data pump's OLE DB source was a table, and it turns out a table is not always read in order, resulting in malformed XML.  Change the source from a table to a SELECT statement with an ORDER BY clause and all is working well; fortunately there was already a rowid (identity) column in the table.

Frustrating to the end user, as it was a package that had been working fine for months.

Of course there was no primary key, making this a heap table; but either way, if the requirement is to have the data ordered, then you need an ORDER BY clause, even if the data is "in the table that way", and even for SSIS.  [It would be nice if Microsoft put a warning on the table data access mode that rows may not come back in order.]

This seems to be more of a problem on heap tables (no primary key) and on Enterprise Edition, which supports advanced scanning (the "merry-go-round" scan).
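
For illustration, the change amounts to switching the OLE DB Source from the "Table or view" access mode to a SQL command like the following (table and column names here are made up):

-- SQL command for the OLE DB Source: order on the existing identity column
SELECT rowid, xml_fragment
FROM dbo.XmlExportTable
ORDER BY rowid;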

     

    Wednesday, 04 November 2009 15:30:59 (Central Standard Time, UTC-06:00) | Comments [0] | SQL Server |  SSIS#
    Friday, 23 October 2009

    The patching game.

    Better give me a whole lotta lumps.....

    Last week Microsoft released a patch for a SQL Server Security vulnerability.  This week Microsoft released SQL Server 2005 CU 6 (which includes the security patch and some other fixes).

    Test those "other fixes" carefully.

We just ran into a change in behavior in how SSIS evaluates variables that are passed into Execute SQL Tasks as parameters.  Not sure exactly which patch introduced the change, as they are cumulative and we were running CU2 previously, so anything from CU3 on up could have been the cause.

We were running version 9.00.4211, which is SQL 2005 SP3 CU2.  We upgraded to the security fix (9.00.4262), and the SSIS package broke.  Hoping not to have to troubleshoot the issue, and needing to be on CU6 anyway, we applied it, hoping it would fix the issue... no luck, time to troubleshoot.

The issue is really one of poor coding practice.

There are some variables passed into an Execute SQL Task to dynamically create a WHERE clause.

The variable in question was declared as an integer.

The variable was then passed into the Execute SQL Task, but its parameter mapping was of a varchar type; the correction was to map the parameter as numeric.

The parameter is the first thing the task's statement evaluates, with "IF ? <> 2".

With the parameter mapping not set properly, the Execute SQL Task passed the parameter in as '', blank.  I confirmed this behavior behind the scenes with a Profiler trace as well.  Interestingly, even with the parameter mapped properly as numeric, it is still passed into SQL Server as a varchar!  But with the parameter mapped as numeric, the varchar is properly populated with the value of the variable being passed in, and the statement completes successfully.
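
As a rough sketch of the pattern (the statement and object names here are hypothetical, not the actual package), the Execute SQL Task uses an OLE DB connection with "?" parameter markers, and the Int32 SSIS variable behind the markers must be mapped with a numeric data type (LONG/NUMERIC) on the Parameter Mapping tab rather than VARCHAR:

-- Execute SQL Task SQLStatement (OLE DB connection, "?" = parameter marker)
IF ? <> 2
BEGIN
    DELETE FROM dbo.Staging_Sales WHERE region_id = ?;
END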

Test those SQL Server patches carefully; there is always the off chance of an unintended behavior change.  Ultimately it was sloppy coding to be passing an integer variable in SSIS to a parameter mapped as varchar, but it had been running that way for 14-16 months without issue.

    Test carefully.

    9.00.4211.00 - 2005 SP3 CU 2
    9.00.4262.00 - QFE Security Fix
    9.00.4266.00 - 2005 sp3 cu 6

    In response on where to get sp3 cu 6:

    http://support.microsoft.com/kb/974648/en-us


     

    Friday, 23 October 2009 08:04:41 (Central Standard Time, UTC-06:00) | Comments [1] | SQL Server |  SSIS#
    Tuesday, 04 August 2009

This post may raise more questions for you than it answers, but it really is something that happened in my shop and confused some people.  So I'm creating this post as a placeholder so I'll be able to reference it 10 months from now when the same issue comes up and I've forgotten the solution.

Basically, for whatever reason, we use Excel, darn!  No matter how I try to recreate these Excel sheets in Reporting Services, some business users won't bite and they still need Excel files.  Generally these SSIS-to-Excel data pumps work fine... the ones that usually trip the process up are ones with long text and multiple "named" worksheets.  To solve the "named" worksheets problem we often use an Excel template file that is copied to the destination and renamed, then the data pump occurs.  As we migrate the SSIS packages from Development and QA to Production, sometimes an "odd" Excel worksheet gets introduced, usually named "Sheet1".  Lovely.  Below is the procedure we used to correct this...

    Thanks to Tom Reeves from our team.

    To Delete Unwanted sheets in Excel for use in SSIS

    1) Map a local drive to the E: drive of the app server (ServerNameHere\E$)
    2) Open template file on server and delete unwanted tabs
    3) Open SSIS package
    a. You’ll need to make note of the tab name(s) that you are working with
    b. Copy query from the source object in the data flow
    4) Create a new DTS package – SQL2000
    a. 1 SQL Connection
    b. 1 Excel Connection
    c. 2 Execute SQL Tasks
    i. Both pointed to the Excel connection manager
    ii. One for Drop table and one for Create table
    5) Open the Transformation flow between the SQL connection and the Excel connection - DTS
    6) Copy and paste the query from the SSIS package into the Source tab
    7) Click the Create button on the Destination tab and copy the code from the window.  You can hit cancel on this screen now.
    8) Open one of the Execute SQL Tasks and set the connection manager to the Excel connection and type your Drop sql statement. 
    a. Example: drop table `SheetNameHere` - No dollar sign in the table name
9) Open the other Execute SQL Task and set the connection manager to the Excel connection and type your Create sql statement.
    Example:
    CREATE TABLE `SheetNameHere` (`Division_Code` Decimal (2,0) ,
    `Division_Dept_Id` Decimal (7,0) ,
    `Office_Dept_Id` Decimal (7,0) ,
    `Last_Update_Date` DateTime ,
    `Reason_Text_Opt_out` LongText ,
    `Reason_Text_Opt_in` LongText ,
    `WHQ_Comments` LongText ,
    `WHQ_User` VarChar (30) ,
    `WHQ_Last_Update_Date` DateTime ,
    `military` Short ,
    `office_type_code` VarChar (1) ,
    `ABC` VarChar (10) ,
    `Client_Segmentation` VarChar (50) ,
    `number_of_windows` Long ,
    `number_of_desks` Long ,
    `Number_of_bilingual_tax_pros` VarChar (50) ,
    `Year_round_indicator` VarChar (1) ,
    `Latino_designation_year_prior` Long ,
    `office_status` VarChar (1) ,
    `Notary` Long ,
    `ITIN` Long ,
    `Spanish_Speaking_Preparer` Long  )

    Tuesday, 04 August 2009 12:13:00 (Central Standard Time, UTC-06:00) | Comments [0] | SQL Server |  SSIS#
    Friday, 10 April 2009

Remember, this means buying it and paying maintenance.  SSIS is a great tool, even if you did have to buy it, but because it's free and bundled with SQL Server 2005/2008, we rarely look at other ETL tools with an open mind.  Recently my list of SSIS pet peeves and the toughest SQL Server 2005 migration ever caused me to wonder, "Would I use SSIS if it wasn't free and bundled with SQL Server?"

I know the company I work for did an RFP for ETL tools, but I was never privy to the responses or even allowed to participate in the process (why include DBAs in the search for an enterprise ETL tool, that'd be foolish!).  Nothing ever came from it, so that leads me to think that SSIS must still be the best value out there, probably because it's free.  I sure would have liked to have seen the responses, especially on cost.

I've found a nice web page on ETL tools; they want you to purchase their comparison, but if nothing else you can see a nice list of ETL vendors:

    http://www.etltool.com/etltoolslist.htm

    No. List of ETL Tools Version   ETL Vendors
    1. Oracle Warehouse Builder (OWB) 11gR1 Oracle  
    2. Data Integrator & Services  XI 3.0 Business Objects, SAP
    3. IBM Information Server (Ascential) 8.0.1 IBM
    4. SAS Data Integration Studio 4.2 SAS Institute 
    5. PowerCenter 8.5.1 Informatica  
    6. Elixir Repertoire 7.2.2 Elixir 
    7. Data Migrator 7.6 Information Builders
    8. Integration Services (SSIS) 10 Microsoft  
    9. Talend Open Studio 1.1 Talend
    10. DataFlow Manager 6 Group 1 Software (Sagent)
    11. Data Integrator 8.12 Pervasive
    12. Transformation Server 5.4 IBM DataMirror
    13. Transformation Manager  5.2.2 ETL Solutions Ltd.
    14. Data Manager/Decision Stream 8.2 IBM Cognos
    15. DT/Studio 3.1 Embarcadero Technologies
    16. ETL4ALL 4.2 IKAN
    17. DB2 Warehouse Edition 9.1 IBM
    18. Pentaho Data Integration  3.0 Pentaho  
    19. Adeptia Integration Server 4.9 Adeptia
    20. Clover ETL 2.5.2 Javlin  

    Interesting Blog post on evaluating some ETL Tools:
    http://it.toolbox.com/blogs/evaluating-software/evaluate-etl-tools-4169

If anyone has any experience with other ETL tools and how they compare with SSIS, I'd love to have you comment on this, letting me know why you're using a different tool.  After working with SSIS on several large projects I'm now very comfortable with it and understand its quirks; depending on its "cost" compared to other products, I'd use it even if it wasn't free.  I hope Microsoft doesn't read this and decide they want to start charging for SSIS (and for sure there are already some features of SSIS that aren't available under certain editions).

One of the greatest strengths of SSIS is how quickly we (DBA, user, developer, BA, QA, etc.) can get at the data, whether for query, analysis, import or export... SSIS and its predecessor DTS have been one of SQL Server's greatest strengths in this area.  I work in a shop with other DBMSs, and SQL Server is consistently more flexible in this area because of SSIS/DTS.  I do think they could have made SSIS better than it is; the learning curve compared to DTS is steep, and the strong data typing of the metadata can be difficult at best.

    Friday, 10 April 2009 12:50:56 (Central Standard Time, UTC-06:00) | Comments [0] | SQL Server |  SSIS#
    Thursday, 09 April 2009

Recently I finished my first deep dive into using SQL Server Integration Services (SSIS) on a project: the conversion of 88 fairly complex DTS packages from SQL 2000 to 2005.  I've used SSIS numerous times in the past, but I still considered myself at the "beginner" level; I'd say I'm almost "intermediate" now that I've completed this project.

All of our SSIS packages for this project were implemented on a separate layer from the DBMS, ultimately giving application administrators the ability to control the jobs (via Windows Task Scheduler).  This ability to separate the packages from SQL Server is one of the greatest benefits of SSIS.  The three cons are the loss of SQL Agent for scheduling, SQL licensing, and the performance of some tasks (can't use the SQL Server Destination).  The pros outweigh these cons, and in the very few instances where a con can't be overcome we do allow the SSIS package to run on the same layer (usually large loads from Oracle to SQL where the SQL Server Destination object must be used for performance).

It is definitely nice to give the administrators unfettered access to the box; they are completely responsible for developing, deploying, configuring and scheduling their own SSIS packages, not the SQL DBA team.  We have 4 SQL Server DBAs and 170 instances; there are a lot more App/Dev labor hours than there are SQL DBA labor hours.  The App/Dev teams love it too; it takes them a bit to get used to it, but every one of them loves it.  Make sure you have good SLAs, as DBAs still need to understand when large loads and complex ETLs will run against a database.

During the process I compiled a list of "pet peeves" for SSIS; here they are.  Hopefully as we adopt SQL 2008 SSIS, I'll be able to cross some of these off.

• Excel (the list of issues seems endless)
  • Columns longer than 500 characters
  • Strong data typing (data conversion on 50+ columns is brutal!)
  • Mapping columns after data conversion, ugh!
• Using variables for dynamic file names is not intuitive
  • This is timestamps in file names, etc.  Once learned it's not difficult, but for others viewing the package for the first time it can be convoluted to figure out how the variables are used to manipulate the name of the file source
• Using wild cards for FTP (either put or get) is not intuitive
  • Some of the DTS tasks being converted would perform a put (mput) of {timestamp}.*; this ended up either remaining a batch file called from SSIS or becoming a For Loop that performed individual puts (really not smart to log in 10 times for 10 puts)
• Mapping columns in data flow tasks
  • If the names don't match there is no option to just do a 1:1 in-order match; frustrating and tedious when you're dealing with 50 or 100 columns to map manually
• Configuration files
  • A must to implement, but they just seem to have a high learning curve
• Restart of failed packages (checkpoints)
  • Just didn't work the way we expected, but once learned it was quite successful
• Oracle integration
  • The pain here was and is endless; a 3rd party tool is really necessary if you're dealing with lots of integration
• Default properties for Fail Parent and Fail Package
  • For us it seemed we always wanted these set to true and the default was false; there seemed to be no way to globally change this
• Precedence Constraint Editor does not contain an expression builder
  • Oh how I missed the 3 ellipses
• Executing another SSIS package from an existing SSIS package
  • We have several "shared" packages, but we weren't able to use the Execute Package Task due to limitations on changing the properties, so we ended up using the Execute Process Task, which calls a batch file with the proper configuration file... ultimately the Execute Package Task was useless
• Readability of configuration files makes editing difficult
  • We ended up using XML Notepad on most servers; it'd be nice to see the property name as part of the node or an element so one could visually scan down the side and find things quicker.
    Thursday, 09 April 2009 13:02:53 (Central Standard Time, UTC-06:00) | Comments [0] | SQL Server |  SSIS#
    Friday, 03 April 2009

We're still not ready to support SQL 2008 (company infrastructure, vendors and the DBA group are a little slow); maybe this June or July we'll do the first SQL 2008 implementation.

For me, most SQL Server migrations have been easy.  No mess, no fuss, hit the "easy button".  We even do the migrations the "hard" way... new servers, named instances, usually to a new clustered or consolidated environment using Polyserve or VM, and we separate all DTS (SSIS) and user jobs out to an application (ETL) server.  As a DBA with hundreds of servers it's a lot easier to let each application group manage their own jobs, off the SQL Server, than for us to handle the requests.  Usually the most difficult issues are connectivity, firewalls and connection strings.  Of the 170 instances we support, very rarely is there a difficult one; we've dealt with some odd performance issues here and there, but for the most part it's been smooth sailing (hah, I remember being up for several days straight monitoring a system with 5K tps and watching it crash every day, but that's the extreme end of the bell curve).

Finally we couldn't put it off any more, and it was time to tackle the elephant in the room.

It's not a big instance (5 databases, 60 GB), but the challenges are big.  We're dealing with 88 DTS packages and 45 SQL Agent jobs; connectivity to multiple Oracle databases, the AS400 and other SQL Servers; 60+ downstream consumers of data; Excel, flat files, text files, web services, and 3rd party vendor imports and exports.  Let the games begin.

I knew I was screwed when I detached the databases from SQL 2000 and attached them to SQL 2005 and got the following errors:  "Attach database failed for Server x.  An exception occurred while executing a T-SQL Statement or batch.  Converting database x from version 539 to 611.  Microsoft SQL Server, Error 195. fn_convertdatetoint_notime is not a recognized function name.  Incorrect syntax near the keyword 'left'".  Ultimately we determined that this issue was caused by user functions set up in the master database in SQL 2000, NOT supported in SQL 2005.  We also ran into isolated issues where T-SQL syntax errors that did not throw any errors in SQL 2000 would not compile in 2005!
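
A simple pre-migration check that would have caught this (a sketch, not part of our actual runbook) is to list any user-created objects sitting in the master database before moving the user databases:

-- List user-created objects in master (works on SQL 2000 and 2005);
-- anything found here must be scripted out and handled before the migration
SELECT name, xtype, crdate
FROM master.dbo.sysobjects
WHERE OBJECTPROPERTY(id, 'IsMSShipped') = 0
  AND xtype IN ('FN', 'IF', 'TF', 'P', 'V')
ORDER BY name;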

Linked server issues were also encountered:
   Setting up a linked server to the AS400 on 64-bit Windows
      Required a new version of the IBM iSeries driver
      Required a patch to the IBM iSeries driver
      Required patching Windows to install 64-bit ODBC drivers
   Setting up a linked server to SQL 2000 SP4
      Required patching SQL 2000:
      http://support.microsoft.com/kb/906954
   Setting up a linked server to Oracle

Overall the process took about a month (with 4 DBAs and other work); the bulk of the work was converting the DTS packages to SSIS.  We did look at some 3rd party utilities, but they still left several hours of work for each package, so we just used the brute force method and did them one at a time.  The process will now spend 2-3 months running in parallel and being QA'd, then it's on to production.

    Hit the "next" button.

    Friday, 03 April 2009 14:03:22 (Central Standard Time, UTC-06:00) | Comments [0] | SQL Server |  Linked Server |  SSIS#
    Wednesday, 25 March 2009

Recently I ran into an issue where I needed to edit a large number of dtsConfig XML files.  I thought there was an option in the Windows operating system to perform a find and replace across multiple files, but alas I can't find it...

Then I remembered a VB Script I had hobbled together from several other people's work to do this.  There are several shareware, freeware and other programs that do this, but I really think that find and replace is so basic a task that it should be easy to kick out...

So here it is.  Remember that .vbs files are a security issue when downloaded from the internet, so I've uploaded them with a .txt extension; rename them to .vbs.  There are two programs, replace.vbs and Replace_in_All_files.vbs.  The latter is the one you must edit with the directory to look for files, the extension of files to search, and the find and replace strings.

Use at your own risk.  The program will make a backup of each file.  VB Script is not the most efficient find and replace for very large files, but if you have 100 or so dtsConfig files to edit, it works like a charm!

The same type of program can be done in a PowerShell script, and it is the cool new buzz language, but since it is only installed on 1 out of the 170 servers we support, it's not quite useful yet... I'll post the PowerShell script later.

    replace.txt (1.76 KB)

    Replace_in_All_files.txt (1.5 KB)

    Wednesday, 25 March 2009 11:37:40 (Central Standard Time, UTC-06:00) | Comments [0] | General Technology | SQL Server |  SSIS#
    Friday, 21 November 2008

Recently I ran across an interesting problem where we had to kick off an SSIS package on a remote server.  I found the following to help with this:

    I tried psexec and it's great. A thing of beauty. Simple, clean and it just works.

    http://technet.microsoft.com/en-us/sysinternals/bb897553.aspx

Here is a batch file that uses it to execute an SSIS package on a remote server:

    @echo off
    echo Enter Username
    set /p UserName=
    psexec \\10.20.30.40 -u %UserName% -e dtexec /sql "\Maintenance Plans\SSIS_Package_Demo"
    pause
    exit

    The only problem with psexec is that you have to run it using a username and password with full administrative rights on the operating system of the server where SQL Server 2005 and the SSIS packages reside.

That's not a minor problem in any type of secure environment.

    It was a deal-killer for us.

    However, rexec is a similar utility that authenticates the user name on the remote computer before executing the specified command.


    http://technet.microsoft.com/en-us/library/bb490989(TechNet.10).aspx#

The rexec client is included with Windows 2000/XP/Vista, but those systems do not come with an rexecd daemon to provide the service.

There is, however, a 3rd party (non-free) Winsock rexecd daemon for Windows...

http://www.denicomp.com/rexecdnt.htm

...and when you use it, "programs are executed on the Windows system in the security context of the user specified in the rexec command."

    Here is a command line that runs the same package:

    rexec 10.20.30.40 -l username dtexec /sql "\Maintenance Plans\SSIS_Package_Demo"

    There are no simple, secure ways to remotely execute SQL Server 2005 SSIS packages short of purchasing extra SQL Server licenses.

A remote execution service like rexecd is an absolute necessity for running SSIS packages.  It's well worth the $44.95 license fee.

    Friday, 21 November 2008 10:14:06 (Central Standard Time, UTC-06:00) | Comments [0] | SQL Server |  SSIS#
    Tuesday, 18 November 2008

    SSIS, Oracle, SQL and 64 BIT - NOT Perfect together !

Recently ran into an issue where we use SSIS to pull data from Oracle, and had problems converting an Oracle numeric data type to SQL Server.  This did not go well; the final corrections seemed OK, until we deployed them.

There are several solutions to this problem; having the right drivers helps, of course, and so does casting / converting the data while bringing it in from Oracle.
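
As a rough sketch of the casting approach (the table and column names are made up for illustration), the Oracle-side source query wraps the unbounded NUMBER column in an explicit precision and scale so the provider has usable metadata:

-- Oracle source query for the OLE DB Source (illustrative names)
SELECT order_id,
       CAST(order_amount AS NUMBER(18,2)) AS order_amount
FROM orders;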

I found a great blog entry on this; I've copied it below (as great links often disappear), and here is the link:

    http://blogs.microsoft.co.il/blogs/bei/archive/2008/10/18/ssis-64bit-using-oracle-provider.aspx 


    SSIS 64bit – Using Oracle Provider

Hello everyone, running an Oracle provider on a 64-bit system can be a pain in the...
The purpose of this document is to explain the main issues when running an SSIS solution that uses Oracle as a data source under 64-bit.  This document explains the problems and offers solutions and workaround procedures.


Symptoms:
When dealing with Oracle as a data source we face two known issues:
1.            The case of the numeric type casting:
              Oracle uses a data type named NUMBER for numerical data.  The numeric data type usually requires setting a precision, e.g. numeric(18,2).  While the OLE DB provider can handle the numeric type, it does so only when precision is set.  When facing metadata of type numeric where precision wasn't set, an exception is thrown, requiring us to use casting within the Oracle query.  This is not too terrible if you have a single table/query, but try to imagine maintaining hundreds of queries.

2.            Provider data rate:
              The purpose of 64-bit hardware is to accelerate performance.  Using a 32-bit provider on 64-bit hardware limits the performance of the package (the OLE DB provider uses COM objects and the x86 type system).

     

    Solution Scenario:

The following solution handles both issues.  The proposed solution is to download and set up a 64-bit generic provider.  The only questions are which provider and in what scenario?

Prepare Step:  Remove previous Oracle client and/or installations

1.       Use the Oracle installer to remove all Oracle objects (including ROOT)

2.       Remove specific Oracle assemblies from the GAC:
Look under %WindowsDir% for the Assembly folder and remove all Oracle-specific assemblies (mainly Oracle Policy and Oracle Data Provider).
Beware not to remove Microsoft generic assemblies (System.* or Policy.Microsoft).

3.       Remove Oracle registry entries
4.       Delete the Oracle folder (remaining items).


Phase 1: Upgrading to the 10g (64-bit) 10.2.0.3 provider
The following phase increases your provider throughput and allows you to execute 64-bit calls to the Oracle hardware; however, it will not solve your numeric casting issue.
This step is supported on two 64-bit hardware schemes:

1.            For x64-based hardware:
              Download and install the specific provider for x64.
              http://www.oracle.com/technology/software/tech/windows/odpnet/64-bit/index.html

2.            For IA64-based hardware (both IA64 and Itanium 2):
              Download and install the specific Itanium provider.
              http://www.oracle.com/technology/software/tech/windows/odpnet/64-bit/index.html

              Remark:  Keep in mind that specific Itanium processors, such as Montecito, require an additional patch offered at the same link!
                   


Phase 2:  Establish an Oracle connection
1.            Have your Oracle DBA establish your TNS_ADMIN settings and set up your "tnsnames.ora" file.
2.            Use the registry to set TNS_LANG with settings equivalent to the Oracle server.

     

Phase 3:  Upgrading to Oracle Data Access Components (ODAC) 11g - 11.1.0.6.21
This phase removes the need to cast the SQL query to support the numeric data type.  By using the Oracle managed provider for .NET, a package can be developed without having to cast the numeric fields.  Performance tests show an estimated throughput of 1 million rows per minute.

Important remarks:
1.            Do not remove the previous 10.2.0.3 provider!
2.            This phase only applies to the x64 type system.  No Itanium support at this time!!
              Itanium users will have to solve the numeric casting problem by casting at the SQL origin.

    Users of the x64 hardware, please download and install the following component:
    Oracle 11g ODAC and Oracle Developer Tools for Visual Studio

    http://www.oracle.com/technology/software/tech/dotnet/utilsoft.html

Phase 4:  Establish an Oracle connection
Since the 11g ODAC usually sets up an additional Oracle root, we need to establish additional Oracle settings for the additional Oracle 11g root.

1.            Have your Oracle DBA establish your TNS_ADMIN settings and set up your "tnsnames.ora" file.
2.            Use the registry to set TNS_LANG with settings equivalent to the Oracle server.
3.            The registry settings for the 11g provider appear under the WOW64 entry within the registry.

Which provider to choose:

Using the SSIS designer within Visual Studio, be sure to choose the following provider: Oracle Data Provider for .NET.  Please do not confuse it with the Oracle Client provider for .NET, which is a Microsoft generic provider!!!!!

Q. I can't see the Oracle Data Provider for .NET in the list?
A. Please do the following:
1. Download the Oracle 11g ODAC and Oracle Developer Tools for Visual Studio for x86 (32-bit) and extract the assembly files.
2. Use the GAC utility to register the assemblies.
3. Go to the machine.config of the 64-bit framework, located under %WindowsDir%\Microsoft.NET\Framework64\v2.0.50727\Config, and look for the Oracle Data Provider entry in the "<system.data>" section.  Copy the settings (altered to match the public key of the x86 assemblies) to the machine.config file of the x86 framework.
4. Restart the server.

Do not forget to execute the SSIS package using either a SQL Server Agent job (Execute SSIS Package step) or the dtexec command line.

     Have fun,

    Eran

    Rob Orchiston said:

Hi Eran, I just stumbled across your post here while looking for something unrelated on Google.  I spent several hours trying to get through to a Microsoft technical representative in North America to get some help with 64-bit providers for Oracle.  The MS call routers gave me the run-around and I never actually talked with an engineer, although my MSDN subscription allows for several free support calls.

As for the 64-bit Oracle providers on our 64-bit Windows server, we couldn't get the MS or the Oracle provider for Oracle working, and on Friday we ended up conceding, so today the server is being rebuilt as a 32-bit Windows server.  Our test environment is running 32-bit and SSIS packages work just fine there.  I had spent several days trying to get the 64-bit providers to work, but we have a production deadline and a hosted server, which made resolving the issue more problematic.

    # October 20, 2008 4:39 PM

    Eran Sagi said:

    Hi Rob,

    I would like to help; could you give me more details?

The scenario in the blog is the result of multiple solutions, all of which are working!

From your comments it looks like your main problem was detecting/using the providers!  Here are some thoughts on that issue:

1. Did you manage to use tnsping to test whether the Oracle server is reachable?  (You can also check using PL/SQL or Toad.)

2. Most of the 64-bit providers work only in a genuine 64-bit environment, meaning you have to use either the dtexec command line utility or a SQL Server Agent job (under a 64-bit instance).

3. Using Visual Studio and/or DTExecUI only activates the 32-bit providers.  Those providers do not include the Oracle Data Provider for .NET unless you specifically installed both the 32-bit and x64 versions!

4. Did you manage to perform "Test Connection" using the data source window of SSIS?

Keep in mind that running a 32-bit SSIS solution limits you to a 3 GB memory space (when using the /3GB setting of the system) and has limited support for multi-threading; the performance gaps are usually high!

It is recommended to try harder to achieve a 64-bit environment.

Since you have contacted your local Microsoft office, try to relay the call to MCS Israel; maybe we can help here!

    Regards,

    Eran

    # October 21, 2008 4:15 PM

    Mark Frawley said:

    Hello Eran

    I am having this exact problem.  Before I embark on the extensive things you recommended, I have a few questions I hope you can answer:

    1 - just before "Phase 4..." you say to download and install "Oracle 11g ODAC and Oracle Developer Tools for Visual Studio" but don't say anything about 32 vs 64 bit.  Later, in answering what to do if the "Oracle Data Provider For .NET" does not appear in the list, you specifically say to install the 32 bit version.  Is the overall conclusion that both are to be installed ?

    2 - in your response to Rob, #2, where you say most 64 bit providers will only work in a genuine 64 bit environment and mention using dtexec or an Agent job, are you suggesting that you cannot execute packages with a 64 bit Oracle provider from within BIDS ?

    Thanks for a very helpful post...

    Mark

    # November 17, 2008 2:36 PM

    Eran Sagi said:

    Hi Mark,

Well, you understood correctly:

  a. You should install both the 32-bit and the 64-bit versions (requires a genuine x64 environment).

  b. The 64-bit SSIS environment is enabled during the execution of dtexec and SQL Server Agent jobs (64-bit version of SQL).  Running the package from within BIDS results in a 32-bit environment being launched.

    I hope that helps,

    Eran

    Tuesday, 18 November 2008 15:03:50 (Central Standard Time, UTC-06:00) | Comments [0] | SQL Server |  SSIS#
    Friday, 15 August 2008

Working with SSIS, a stored procedure was recently changed from a simple SELECT statement to include some more procedural statements that utilized temporary tables.  Surprisingly, the SSIS package stopped working, even though the result set from the stored procedure remained the same.  One of the reasons to use a stored procedure in SSIS was to avoid embedding the data access logic in the SSIS packages, so what gives?

    Error is:


    The component is not in a valid state. The validation errors are:
    Warning at {6D024AC3-A299-4316-B8D6-BF5A7006768A} [OLE DB Source [1]]: The external metadata column collection is out of synchronization with the data source columns. The "external metadata column "employee_office" (36)" needs to be removed from the external metadata column collection.

    Do you want the component to fix these errors automatically?



Some quick research indicates this error is exactly what it says: metadata.  SSIS needs to know the metadata; for stored procedures used in an OLE DB data source, it gets this by examining the first statement in the stored procedure.  If the first statements create temporary tables rather than being a SELECT statement, you've got the problem.  I've seen instances where table variables are OK and don't cause the problem.

I've seen several solutions to this problem using "set fmtonly off" in the SSIS call and "set nocount on" in the stored procedure.  While "set nocount on" is usually recommended, I'm not too pleased with the "set fmtonly off" solution, as I've read of cases where it may or may not work and can cause performance issues.

There seem to be several different ways to fix this, each of them slightly different; my preference is option 3.

1.  Put "set nocount on" in the stored procedure, and put "set fmtonly off" before the execution of the procedure [in the SSIS call].

2.  Rewrite the stored procedure to use table variables instead of temporary tables; a good solution as well.

3.  Add some metadata and "set nocount on" to the stored procedure, with a "short-circuited" IF clause (IF 1 = 0) and a bogus SELECT statement at the top.  I tested leaving the "set nocount on" out, and it did not work.

CREATE PROCEDURE [dbo].[GenMetadata]
AS
    SET NOCOUNT ON

    IF 1 = 0
        BEGIN
            -- Publish metadata
            SELECT  CAST(NULL AS INT) AS id,
                    CAST(NULL AS NCHAR(10)) AS [Name],
                    CAST(NULL AS NCHAR(10)) AS SirName
        END

    -- Do real work starting here
    CREATE TABLE #test
        (
          [id] [int] NULL,
          [Name] [nchar](10) NULL,
          [SirName] [nchar](10) NULL
        )
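
With the metadata stub in place, the OLE DB Source SQL command can just call the procedure (option 3); the option 1 variation forces metadata discovery instead.  A sketch of both calls:

-- Option 3: the stub publishes metadata, so a plain call works in the OLE DB Source
EXEC [dbo].[GenMetadata];

-- Option 1: no stub in the procedure, so force SQL Server to run it for metadata
SET FMTONLY OFF;
EXEC [dbo].[GenMetadata];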

     


    References:

    http://www.delphi.co.za/PermaLink,guid,56a56e23-d9d5-44dd-9f0b-0f8b80d0b462.aspx

    http://forums.microsoft.com/MSDN/ShowPost.aspx?PostID=572199&SiteID=1

    http://scotta-businessintelligence.blogspot.com/2007/05/ssis-ole-db-source-component-stored.html

    http://blogs.conchango.com/jamiethomson/archive/2006/12/20/SSIS_3A00_-Using-stored-procedures-inside-an-OLE-DB-Source-component.aspx

     

    Friday, 15 August 2008 11:56:35 (Central Standard Time, UTC-06:00) | Comments [0] | SQL Server |  SSIS#
    Thursday, 31 July 2008

    SSIS, Excel and 64 Bit SQL Server, wonderful together!

    The connection manager "DestinationConnectionExcel" will not acquire a connection because the package OfflineMode property is TRUE. When the OfflineMode is TRUE, connections cannot be acquired.

We've seen similar errors to this before and I blogged something about it, blah blah blah, with a solution like re-installing the client tools.  Before trying that, try this; it's an easier fix and may work.  This was found by Tom Reeves, an excellent SQL Server DBA!

    Register 32bit dtexec

I ran into this issue yesterday, and once in the past, and I finally figured out how to fix it quickly.  The issue occurs on a 64-bit server trying to use the 32-bit dtexec command.  Typically the package will have an Excel connection in it, or some other type of object that doesn't have a 64-bit driver.  When you install the 64-bit Integration Services it registers the 64-bit version of dtexec, which in turn unregisters the 32-bit version.  To fix this you have to register the 32-bit version again.  You can do this by running the command below on the server with the issue.

    %windir%\syswow64\regsvr32 "%ProgramFiles(x86)%\Microsoft SQL Server\90\dts\binn\dts.dll"

    Thursday, 31 July 2008 07:27:01 (Central Standard Time, UTC-06:00) | Comments [0] | SQL Server |  SSIS#
    Wednesday, 11 June 2008

    We were trying to setup an SSIS Package on a SQL Server, when we would try to open one of the Data Flow tasks we would get the following error:

    The designer could not be initialized. (Microsoft.DataTransformationServices.Design)……..

    There was more to the error, but I didn’t get a screen shot.

    Basically here’s what happened:

    Systems affected: 64bit SQL 2005 SP2

When we installed SP2 it uninstalled the 32-bit DTS.dll file and replaced it with the 64-bit version.  The BIDS tool uses the 32-bit version, so the fix was to re-register the 32-bit DLL using the following command: %windir%\syswow64\regsvr32 "%ProgramFiles(x86)%\Microsoft SQL Server\90\dts\binn\dts.dll"

    Here’s the link to the Microsoft article:

    http://support.microsoft.com/kb/919224

    Wednesday, 11 June 2008 13:57:14 (Central Standard Time, UTC-06:00) | Comments [0] |  SSIS#
    Tuesday, 13 May 2008

The team was working an issue with an SSIS package failing on a new installation of SQL Server.

Not sure why, but basically we had an existing instance on a server and had the capacity to install a second instance.  We installed and prepared the new instance, and all worked great.

We have an administrative DTS package that pumps data to an Excel spreadsheet and emails the data.

We could not get the package to run and received the error below.  Note the OfflineMode error: this option is set to false in the package, and we tried all kinds of things to change it, with no luck.


    Message
    Executed as user: VSQLCRM\SYSTEM. ...ute Package Utility  Version 9.00.3042.00 for 32-bit  Copyright (C) Microsoft Corp 1984-2005. All rights reserved.    Started:  10:27:21 AM  Error: 2008-05-13 10:28:52.79     Code: 0xC0014019     Source: Tracing_SSIS      Description: The connection manager "DestinationConnectionExcel" will not acquire a connection because the package OfflineMode property is TRUE. When the OfflineMode is TRUE, connections cannot be acquired.  End Error  Error: 2008-05-13 10:28:52.79     Code: 0xC00291EC     Source: Drop Baseline Tab Execute SQL Task     Description: Failed to acquire connection "DestinationConnectionExcel". Connection may not be configured correctly or you may not have the right permissions on this connection.  End Error  Warning: 2008-05-13 10:28:52.79     Code: 0x80019002     Source: Populate Baseline Tab      Description: SSIS Warning Code DTS_W_MAXIMUMERRORCOUNTREACHED.  The Execution method succeeded, but the number of e...  Process Exit Code 0.  The step succeeded.

The only solution was to uninstall the client components and re-install them.

Not sure why this works, but somehow the installation of the 2nd instance and the application of the associated Service Pack 2 and CU6 caused this.  It's also possible that the automated push install from Polyserve somehow causes it, though all the Polyserve push does is run an unattended install script, so the bug would be with that utility as run through Polyserve.

Lost 4+ hours of work chasing this bug down.

    Tuesday, 13 May 2008 14:42:03 (Central Standard Time, UTC-06:00) | Comments [0] | Polyserve |  SSIS#
    Friday, 11 January 2008

    Error: 2008-01-10 21:03:38.38
       Code: 0xC002F210
       Source: Create DB_Stats Worksheet Execute SQL Task
       Description: Executing the query "CREATE TABLE `DB Stats` (`StatDate` DateTime, `DayTotal` Long, `DBA` Long, `master` Long, `msdb` Long, `EAM` Long, `ASPState` Long, `EAM_Load` Long, `tempdb` Long)" failed with the following error: "Too many fields defined.". Possible failure reasons: Problems with the query, "ResultSet" property not set correctly, parameters not set correctly, or connection not established correctly.
    End Error
    Warning: 2008-01-10 21:03:38.49
       Code: 0x80019002
       Source: Create DB_Stats Tab
       Description: SSIS Warning Code DTS_W_MAXIMUMERRORCOUNTREACHED.  The Execution method succeeded, but the number of errors raised (1) reached the maximum allowed (1); resulting in failure. This occurs when the number of errors reaches the number specified in MaximumErrorCount. Change the MaximumErrorCount or fix the errors.
    End Warning

Anyone?

     

    Friday, 11 January 2008 09:41:55 (Central Standard Time, UTC-06:00) | Comments [0] |  SSIS#