Configuration File Locations - Likes To Say It Can't Load Them In Designer
Feb 28, 2006
I have a package and added a configuration file to it to hold settings such as connection strings etc.
When I open the package up in the designer and check where the configuration file is from, it likes to specify it with a full path like this:
\\myserver\projects\packages\blah.xml
Where blah.xml is the configuration file and the package file is in the same exact directory.
This creates an issue when I build the package and then try to install it on a server. When installing, the server tries to read the config from \\myserver\projects\packages\blah.xml, which it cannot do.
Conversely, I've tried editing the location in the designer, changing it to just be "blah.xml". Then, after using the build option, I am able to install the package successfully on the server (it can now find the xml file).
However, when I later load the package in the designer to edit or change it, the designer complains that the "configuration file could not be loaded" and it can't find it, even though it is in the same directory as the package definition file.
I have been asked by the powers that be to make sure that my configuration database gathers some auditing information. I have looked over the items below and have no idea where to find the information in SQL Server. If you could tell me the database and table the information is located in, I could write the T-SQL to find it. IMPORTANT: you do not have to answer all of them; I am grateful to anyone who picks and chooses even one to help me with.
SQL Server Auditing
General/Access Auditing Items
Software Install (DB): Verify that the sample and demonstration databases are not installed and remove the temporary setup files created by the SQL Server setup process.
Authentication and Access Control
Default user accounts & passwords (DB): Default user account passwords will be changed and will follow the corporate password standard for frequency of change, length and complexity User authentication (DB): Windows authentication is preferred. Mixed mode authentication can be used. User passwords (DB): All user passwords (mixed mode, SQL authentication, Windows authentication) will follow the corporate password standard for frequency of change, length and complexity. Sharing database user accounts (DB): Sharing of database user accounts is not permitted without an explicit exception documented and granted by IS Information Security. Normal users submitting jobs (DB): Normal user jobs should not be run as SA. Create a role-based policy for normal users that must be allowed to run jobs. They should be included in the role created for job submission.
Monitoring and Reporting
Auditing (DB): For all database installs after XX/XX/XXXX, auditing should be turned on and at minimum set to log failed connections. Logs should be saved to a different hard drive than the one on which the data files are stored.
SQL error log (DB): Processes should be in place to protect error log data so it can be reviewed for seven days prior to being overwritten or deleted from the system.
Networking
Cross-database ownership chaining (DB): Cross-database ownership chaining is disabled.
Code of stored procedures, triggers and views (DB): Who has access.
Public access to SYSXLOGINS and SYSDATABASES tables: The public role will not have access to the SYSXLOGINS or the SYSDATABASES tables.
Public access to SYSOBJECTS and SYSCOLUMNS tables: The public role will not have access to the SYSOBJECTS and SYSCOLUMNS tables.
Public access to stored and extended stored procedures: The public role will not have access to stored or extended stored procedures.
Public access to xp_regread and xp_instance_regread: Restrict public role access to xp_regread and xp_instance_regread.
Public group access to mswebtasks table: The public group will not have insert, update, delete or select permissions on the msdb.dbo.mswebtasks table.
Temporary directories: For all database installs after xx/xx/xxxx, all temporary directories will be cleaned on a periodic basis.
Auditing II: Auditing should be turned on and at minimum set to log failed connections. Logs should be saved to a different hard drive than the one on which the data files are stored.
SQL Server instance visibility: SQL Server instances should not be visible across the network. They should be set up with the hidden option activated.
Default ports: SQL Server should be configured to not use the default ports.
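Not a full answer, but a minimal T-SQL sketch of where a few of the items above can be checked on a SQL Server 2000-era instance; the sample database names and system tables are the standard ones, and these queries are only a starting point rather than a complete audit.

-- Are the sample/demo databases still installed?
SELECT name FROM master.dbo.sysdatabases
WHERE name IN ('pubs', 'Northwind');

-- Authentication mode (1 = Windows authentication only, 0 = mixed mode)
SELECT SERVERPROPERTY('IsIntegratedSecurityOnly') AS WindowsAuthOnly;

-- Permissions the public role holds on sensitive system tables in master
SELECT o.name AS object_name, p.action, p.protecttype
FROM master.dbo.sysprotects p
JOIN master.dbo.sysobjects o ON o.id = p.id
WHERE p.uid = USER_ID('public')
  AND o.name IN ('sysxlogins', 'sysdatabases', 'sysobjects', 'syscolumns');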
I know I've seen documentation on this but I can't find it at the moment. What are the recommended file locations for a SQL install? System and data on a RAID drive and logs on a separate drive that's mirrored? Oh, and if anyone has links to this info, let me know also.
I am trying to split the log and data files between two drives on a new SQL Server 2000 installation. I followed the instructions from the MS article 224071 (Moving SQL Server databases to a new location with Detach/Attach). Unfortunately, when I try to move the Master database as per the instructions, access to SQL Server is lost; the service will never start again. I've tried this twice after re-installing, with the same results. Anyone have any ideas as to what is going wrong? Tim
Here is what I am trying to accomplish. I need to move *.pdf files from a local directory into a local staging directory, then FTP them from the staging directory up to the customer's site, then move the files from the staging directory to an archive directory. I can do this fine as long as all values are static in SSIS; I need help figuring out how to do this using variables. The directories to be used are supplied by the DB; all that is given is the directory itself, and all files in the directory need to be moved. Any help would be appreciated. Thank you, Mike
Hi, I have to create a database for an assignment and I am using MSSQL, and I need a little bit of help with something that I don't think can actually be done. Basically I have an artist table. Each artist may well have a website of their own where their biography can be found. In my artist table I have an artist_bio field where I want the URL of the artist's bio to be placed, and only the URL. So, for example, I would want www.iamanartist.com/mybio to be put into that field. I was wondering if there is a constraint that could do this? I don't think there is, because URLs can be so complicated, but if someone could tell me whether it is or isn't possible I would be really grateful. I also want to do something similar with the location of a hardcopy of their work: I have a digital_images table with a location field where the location of their work on a hard drive will be stored. Is this possible or not as well? Cheers for any help you can give, Liam
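A CHECK constraint can't fully validate a URL, but it can enforce a rough shape. Below is a minimal sketch against the artist and digital_images tables described above; the LIKE patterns are only an assumption about what should count as a URL or a drive path here.

-- Rough "looks like a URL" check on the bio field
ALTER TABLE artist
ADD CONSTRAINT CK_artist_bio_url
CHECK (artist_bio LIKE 'http://%' OR artist_bio LIKE 'https://%' OR artist_bio LIKE 'www.%');

-- Rough "looks like a drive path" check on the image location field
ALTER TABLE digital_images
ADD CONSTRAINT CK_digital_images_location
CHECK (location LIKE '[A-Za-z]:\%');

Anything stricter than a prefix check like this really needs to be validated in the application layer rather than in a constraint.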
It works fine in the designer, but when I load it in Reporting Services I get the following error. Anybody know what to do? There is one subreport with this report; maybe it's the value being passed, but what could be wrong?
Item has already been added. Key in dictionary: '9' Key being added: '9'
I have a simple question. After running the SSIS deployment utility to install a couple of SSIS packages onto a SQL Server, is there a way of determining programmatically (at some later stage) where the dependency files (in my case a simple xml configuration file) for the SSIS packages were installed, if the user chose not to install them in the default location?
When I go to open a file in SSMS, the Open File dialog box appears. On the left side of that box there is a panel containing shortcuts to various locations: Desktop, My Projects, and My Computer. I would like to add shortcuts to folders that I use to this area, but I haven't been able to figure out how. How can I do this?
When adding an SSIS step to a SQL Server Agent job, when selecting the location of a config file, the dialog lets you select from the database server you're working with. If selecting the location of the package itself (when the source is File System), the dialog lets you select from the machine where Management Studio is sitting instead of from the database server. Is that intentional? And if so, why? Should I just use a fully qualified file name for the package location rather than one using a drive letter?
Hi. On our SQL Server (2000) we can have queries that take at least 10-20 minutes (they require a full table scan of a billion-row table). When one of these queries is running it substantially slows down very quick queries (sub-second queries take several seconds). I believe there is no way to set priorities, but I was wondering if there are other configuration settings that could help. The server is dual processor, so maybe setting maxdop to 1 will help. Memory size is dynamically assigned up to 4Gb, but as the DB size is > 1Tb I'm not sure if allowing this much memory is actually decreasing performance when the quick query tries to get through. Any suggestions? Thanks, Mike
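As a hedged sketch of the two knobs mentioned above (not a recommendation for this particular server), capping parallelism and trying a fixed memory ceiling in SQL Server 2000 looks like this; the 3072 MB figure is just a placeholder to experiment with:

EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
-- Stop a single long scan from taking every CPU
EXEC sp_configure 'max degree of parallelism', 1;
RECONFIGURE;
-- Try a fixed upper bound instead of fully dynamic memory (value is a placeholder)
EXEC sp_configure 'max server memory (MB)', 3072;
RECONFIGURE;

The long-running queries can also be limited individually with an OPTION (MAXDOP 1) hint, which leaves the server-wide setting alone.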
I have a query that searches using the hotel name as its first parameter and the hotel's location (e.g. London) as its second parameter.
My client wants to search on part of the hotel name and part of the location name (so I used the like statement)
Oh and the location searches using like on 3 location fields.
My client is complaining that when he types hilton and leeds he gets all the Hilton hotels in Leeds plus a few extra hotels that aren't Hilton hotels. I think the query is too sensitive for its own good, but he's pushing me on this issue, so could anyone advise me whether this query could be made to select only the hotels with a name like Hilton in Leeds? I've pasted the query below. Thanks for your time.
By the way, one hotel it pulls up which is wrong is Heath Cottage Hotel, whose location field is leeds.
SELECT * FROM hotel WHERE name LIKE '%' + @Hotelname + '%' AND location1 LIKE '%' + @city + '%' OR location2 LIKE '%' + @city + '%' OR location3 LIKE '%' + @city + '%'
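For what it's worth, the extra hotels are most likely an AND/OR precedence issue: without parentheses, the name filter only applies to the location1 branch, so any hotel whose location2 or location3 matches the city comes back regardless of its name. A minimal sketch of the grouped version, assuming the intent is "name matches AND the city matches any of the three location fields":

SELECT *
FROM hotel
WHERE name LIKE '%' + @Hotelname + '%'
  AND (   location1 LIKE '%' + @city + '%'
       OR location2 LIKE '%' + @city + '%'
       OR location3 LIKE '%' + @city + '%');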
I have a query below that performs horribly:

@KeywordOne char(6),
@KeywordTwo char(6),
@KeywordThree char(6),
@KeywordFour char(6),
@KeywordFive char(6)

SELECT c.Something
FROM dbo.tblStuff c
WHERE c.SomeColumnName = 0
  AND (c.Keyword LIKE '%' + @KeywordOne + '%' OR @KeywordOne Is Null)
  AND (c.Keyword LIKE '%' + @KeywordTwo + '%' OR @KeywordTwo Is Null)
  AND (c.Keyword LIKE '%' + @KeywordThree + '%' OR @KeywordThree Is Null)
  AND (c.Keyword LIKE '%' + @KeywordFour + '%' OR @KeywordFour Is Null)
  AND (c.Keyword LIKE '%' + @KeywordFive + '%' OR @KeywordFive Is Null)

The contents of column c.Keyword look like this:
Row1: 123456,321654,987987,345987
Row2:
Row3: 123456,987987
etc.

What can I do to get this to perform reasonably? I cannot use full-text search. Any help is appreciated. lq
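One commonly suggested direction (a sketch only, with an assumed StuffID key and an assumed name for the new child table) is to normalize the comma-separated Keyword column into its own indexed table, so the leading-wildcard LIKEs become indexable equality matches:

-- Child table holding one keyword per row (names are assumptions for illustration)
CREATE TABLE dbo.tblStuffKeyword (
    StuffID int     NOT NULL,   -- assumed key back to dbo.tblStuff
    Keyword char(6) NOT NULL,
    CONSTRAINT PK_tblStuffKeyword PRIMARY KEY (Keyword, StuffID)
);

-- The search then joins on equality instead of scanning with LIKE '%...%'
SELECT DISTINCT c.Something
FROM dbo.tblStuff c
JOIN dbo.tblStuffKeyword k ON k.StuffID = c.StuffID
WHERE c.SomeColumnName = 0
  AND k.Keyword IN (@KeywordOne, @KeywordTwo, @KeywordThree, @KeywordFour, @KeywordFive);

Note this version returns rows matching any supplied keyword; reproducing the original all-supplied-keywords-must-match logic would need a GROUP BY with a HAVING COUNT check.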
I need to search a field containing the word I enter, but the word is bracketed by "{{" and "}}". For example, if I want to search for the keyword apple, I need to search with the following SQL statement.
select name from table where name like '%{{apple}}%'
But the column may contain spaces between the brackets and the keyword, i.e. {{ apple }}. How can I write a SQL statement to search name for the keyword?
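One possibility (a sketch, assuming at most a single space on either side of the keyword) is to strip the spaces inside the braces before comparing:

SELECT name
FROM [table]
WHERE REPLACE(REPLACE(name, '{{ ', '{{'), ' }}', '}}') LIKE '%{{apple}}%';

LTRIM/RTRIM won't help here because the spaces sit in the middle of the string, so nested REPLACE (or a looser pattern such as '%{{%apple%}}%', at the cost of also matching words like pineapple) is the usual workaround.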
Lately I had the following message on Win2000 SP4 with SQL Server 2000 SP3a: "DTS Designer Error - Specified file not found". Then in the DTS editor the Task menu objects are missing and it is not possible to link a source and a target using the Transform Data Task.
Before it used to work fine.
I have done the following:
- installing the latest SP again: same issue again
- reinstalling SQL Server with a deep registry cleaning: same issue again
- starting disabled Windows services: same issue again
any idea?
Carsten
edit: Could it be that a recently installed fix for Win2K interferes with the DTS Designer?
Hi all! I am quite new to MS SQL administration, so let me ask how this works on your instances of SQL Server. We have several DTS packages on our server, all of them managed on a workstation which has a serious hardware problem. So we would like to tackle two problems at once and have decided to develop a systematic way of managing DTS. One of several aspects of this operation would be migrating from system ODBC data source definitions to file ODBC sources (.dsn files), in order to make them easier to manage (backup, for example, and even reusability on other workstations). All .dsn files should be located on some network resource (\\server\directory...) which would be set as the default ODBC directory in the ODBC administrator on the management station. When I try to do so, it appears that the EM DTS Designer does not remember the path to the DSN files (for example, on the design panel I choose file DSN and use the browse button to point at a certain .dsn file, and then after the DTS is saved the path disappears). Do you use this facility (file .dsn) in the DTS EM Designer, or has MS perhaps treated it as useless and nobody wants to use it? Regards, K
I cannot get the log file path to be read from the configuration.
If the path in the Connection Manager is invalid, the package throws an error, "SSIS logging provider has failed to open the log", instead of reading the path from the config file. What am I doing wrong?
Here is the portion of the config file. Everything else is read from the config file correctly.
I need to import data from a csv file to an sdf database (SQL CE 2.0). When I copy the csv file to my mobile device, it takes more than 1 hour. The sdf file afterwards has a size of 20MB.
If I create an sdf file (SQL CE 3.0) on the desktop, it just takes a few minutes.
Is it possible to create an sdf file (SQL CE 2.0) on the desktop legally? My clients don't want to buy SQL Server, because this should be an inexpensive solution.
Or is there another way to create the database with better performance?
Hi dear members, can anyone please tell me how we can load multiple files, or even a single .SQL file stored in any physical location (hard disk), from the SQL prompt? I have written some scripts in different files, and now I want to run those scripts. Do I always need to manually open those scripts and run them in Query Analyzer, or is there some way out? Please HELP!!!
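One way, sketched below under the assumption that the script sits on a path the server can reach, is to drive the osql command-line tool; the path and file name are placeholders. From a plain command prompt the osql line alone does the job; the xp_cmdshell wrapper is only needed if it has to be kicked off from inside a query window.

-- From a command prompt (no T-SQL needed); -E uses a trusted connection,
-- -i names the input script file (path is a placeholder):
--   osql -E -i C:\scripts\script1.sql
-- The same call issued from Query Analyzer via xp_cmdshell:
EXEC master..xp_cmdshell 'osql -E -i C:\scripts\script1.sql';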
I have a simple SSIS package (stored in the file system) that gets a file path from a configuration file. The configuration type is an indirect XML configuration file that uses an environment variable to store the location of the configuration file.
When I run the package using dtexecui, or just dtexec from a command line, the package successfully picks up the file path from the configuration file (for verification I am writing out the variable containing the path to the log file). However when I run the package from a SQL Agent job it appears that the configuration file is not being used (the path is set to the same dummy path that I used during development). I have tried running the job as both a CmdExec and an Integration Services job and both fail on the same thing (invalid file path).
Both the SQL Server Agent service and the Integration Services service use a domain account as their start up account. This domain account has been included in the local administrators group on the server (in case it was having trouble accessing the environment variables).
What is the problem here - surely changing the way in which the package runs should not affect the configuration file settings! Any help would be appreciated...
I want to load a table from a file. My file has fixed-length (fixed-block) records and has the same fields as the table. I need the correct syntax, because the following gives errors:
"load from file1 insert into table1"
ANY ADVICE will be greatly appreciated because I'm not an expert in databases. Thank you very much.
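For reference, a minimal sketch of how this is usually written in SQL Server; the file paths are placeholders, and a format (.fmt) file is what describes the fixed-length columns, so one would still have to be created to match the file layout:

-- file1 / table1 follow the post; the paths are placeholders
BULK INSERT table1
FROM 'C:\data\file1.dat'
WITH (FORMATFILE = 'C:\data\file1.fmt');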
I try to back up a database but this error message pops up: could not load file or assembly 'SqlManagerUi, Culture=neutral, PublicKeyToken=89845dcd8080cc91' or one of its dependencies. Method signature has invalid calling convention. (Exception from HRESULT: 0x80131239) (mscorlib)
I am receiving the following message when I try to create an SSIS project in Visual Studio 2005 Team Suite:
Could not load file or assembly "Microsoft.AnalysisServices.Controls, Version=9.0.242.0, Culture=neutral, PublicKeyToken=89845dcd8080cc91" or one of its dependencies. Strong name validation failed. (Exception from HRESULT: 0x8013141A)
I am using the Bulk Insert task to load data from a .dat file into a SQL table, but I am getting the error below.
[Bulk Insert Task] Error: An error occurred with the following error message: "Cannot fetch a row from OLE DB provider "BULK" for linked server "(null)". The OLE DB provider "BULK" for linked server "(null)" reported an error. The provider did not give any information about the error. The bulk load failed. The column is too long in the data file for row 1, column 1. Verify that the field terminator and row terminator are specified correctly.".
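Since the message points at the field and row terminators, one quick check (a sketch with a placeholder path and table name, not the task's own settings) is to run the same load as a plain BULK INSERT with the terminators spelled out and see whether it reads row 1 correctly:

BULK INSERT dbo.TargetTable           -- placeholder table name
FROM 'C:\loads\input.dat'             -- placeholder path to the .dat file
WITH (
    FIELDTERMINATOR = ',',            -- change to the file's actual delimiter
    ROWTERMINATOR = '\n'              -- or '\r\n' for Windows line endings
);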