T-SQL (SS2K8) :: Searching For Job Logging Options?
Mar 11, 2015
Is there any SQL script with which we can find out all local paths used in job steps, and the output_file_name path used in the advanced settings of each job step?
Example: if I have a job whose steps use output file logging, I need to display the job names, step IDs, and commands where local paths are used. The reason I want this output is that we are going for a server migration, so we need to create those paths and folders on the new server. I don't want to visit each and every job and look for local paths; that's why I'm in search of the script.
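A minimal sketch against the msdb job catalog; the drive-letter LIKE filters are only illustrative and may need adjusting for your environment:

SELECT  j.name AS job_name,
        s.step_id,
        s.step_name,
        s.command,
        s.output_file_name
FROM    msdb.dbo.sysjobs AS j
        INNER JOIN msdb.dbo.sysjobsteps AS s ON s.job_id = j.job_id
WHERE   s.output_file_name LIKE '[A-Z]:\%'      -- output files written to a local drive
        OR s.command LIKE '%[A-Z]:\%'           -- commands that reference a local path
ORDER BY j.name, s.step_id;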
Hi, what SSIS logging option should I check to log the success of each component? In the log file, I'd like to see which component SSIS is currently executing and which component(s) have passed execution. Thanks
I would like to create an index with a few options, but I'm having issues trying to parse and run it successfully. The issue is near the end of the statement. Here's the create statement below:
-- Index PSAPSACTIVITYDEFN on table PSACTIVITYDEFN
IF EXISTS(SELECT 1 FROM sysindexes si INNER JOIN sysobjects so ON so.id = si.id
I want to create a function that searches for allowed characters within a table (that contains the allowed characters) and replaces any characters outside this range with a space.
For example -
'Bill123?', 'Jones12.z-', 'John&12/', 'QWERT123&4'
Wanted results – the single quotes are there to show the space for the replaced characters.
'Bill123 ' 'Jones12.z ' 'John&12 ' 'QWERT123 4'
Example SQL data
CREATE TABLE [Common].[AllowedCharacters] (
    [Character] [varchar](1) NOT NULL,
    [Replacement] [varchar](10) NULL,
    [AlwaysInclude] [bit] NOT NULL
)
GO
SET ANSI_PADDING OFF
[code]....
The function will wrap around the column names, and I know it can be done without a table to validate the characters, but it must be done this way.
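A minimal sketch of such a scalar function, assuming the Common.AllowedCharacters table above (the function name is a placeholder, and this version ignores the Replacement/AlwaysInclude columns and simply substitutes a space for anything not in the table):

CREATE FUNCTION Common.ReplaceDisallowedCharacters (@Input varchar(200))
RETURNS varchar(200)
AS
BEGIN
    DECLARE @i int, @c varchar(1), @Output varchar(200);
    SELECT @i = 1, @Output = '';
    WHILE @i <= LEN(@Input)
    BEGIN
        SET @c = SUBSTRING(@Input, @i, 1);
        -- keep the character if it is in the allowed list, otherwise write a space
        -- (note: the comparison follows the column collation, e.g. case-insensitivity)
        IF EXISTS (SELECT 1 FROM Common.AllowedCharacters WHERE [Character] = @c)
            SET @Output = @Output + @c;
        ELSE
            SET @Output = @Output + ' ';
        SET @i = @i + 1;
    END;
    RETURN @Output;
END

It would then wrap a column as SELECT Common.ReplaceDisallowedCharacters(SomeColumn) FROM SomeTable.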
I am developing a package on my local workstation. I have defined two logging service providers. One is for SQL Server and the other is for the Windows Event Log. I am using the Dts.Log method in a script task to write log entries.
Logging is working properly with the SQL Server provider and rows are being inserted into the sysdtslog90 table. However, the only events that are being logged in the Windows Event Log are the package start and end events which I believe SSIS is doing automatically anyway.
Is there something I need to do to enable Windows Event Log logging other than defining a log provider and making sure it is checked as active? Won't SSIS write to two different logs with one Dts.Log call? Any ideas on what might be going wrong with my approach?
Hi, I decided to use the SQL Server log provider to store logging data for all my Integration Services packages. I also created some reports on this data for operating purposes. The problem is that the name of the executing package is not always written to the log, but only the name of the single task which failed. That is not very useful information for operations, because I do not see any way to get the name of the package from the information logged in the sysdtslog90 table in the database which I defined for SSIS logging.
How do I configure the package to always log the package information into the table, too?
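One workaround, sketched under the assumption that the standard sysdtslog90 layout is in use and that the automatic PackageStart rows and OnError events are being logged, is to recover the package name by joining each row back to the PackageStart row that shares its executionid, since the source column of that row holds the package name:

SELECT  l.starttime,
        l.event,
        l.source  AS failing_source,
        l.message,
        p.source  AS package_name
FROM    dbo.sysdtslog90 AS l
        INNER JOIN dbo.sysdtslog90 AS p
            ON  p.executionid = l.executionid
            AND p.event = 'PackageStart'
WHERE   l.event = 'OnError';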
I recently read the project real ETL design best practices whitepaper. I too, want to do custom logging as I do today, and also use SSIS logging. The paper recommended using the variable system::PackageExecutionId to tie the 2 logging methods together.
Very often, when I generate SQL scripts for a table, I forget to go to the Options tab to check the PK, default, and index boxes. Is there a way to permanently set the SQL Server generate-scripts options once?
Hi,
I have created a SQL script through Enterprise Manager to drop a column. By default it's creating a lot of SET commands. I doubt whether all these SET options are required or not. Please comment on this issue.

BEGIN TRANSACTION
SET QUOTED_IDENTIFIER ON
SET TRANSACTION ISOLATION LEVEL SERIALIZABLE
SET ARITHABORT ON
SET NUMERIC_ROUNDABORT OFF
SET CONCAT_NULL_YIELDS_NULL ON
SET ANSI_NULLS ON
SET ANSI_PADDING ON
SET ANSI_WARNINGS ON
COMMIT
BEGIN TRANSACTION
ALTER TABLE Employee
DROP COLUMN OrderDetails_ID
GO
COMMIT

Dil
I could do with a couple of pointers to the best options to achieve my goal. I'm pretty close with the way I've done it, but I feel there is a more elegant solution out there, so your help would be most appreciated.
The problem is finding the best way of moving some changed data from SQL Server 2000 into SQL Server 2005. We are only interested in some tables in 2000 (and sometimes just subsets of those). Because there are quite a few tables and we want to set up a schedule to run periodically, we chose SSIS. The main reason for this is to utilise a For Each loop that pulls each table's name from a one-column staging table of table names (that way we can do more or fewer comparisons by simply adding to and removing from the staging table). Also in this loop, using the table name as a variable, we run an Execute SQL task along the lines of 'SELECT * FROM varTable EXCEPT SELECT * FROM varTable_tracker', which gives us the difference between the two tables (where the tracker table is a copy of the data table which is synchronised at the end of the job run). So far so good.

Now the tricky bit: EXCEPT only works under 2005, and the tables are in 2000, so we ended up having a linked server in 2005 back to the 2000 table. Is there a way of achieving the same result without involving the linked server? Or is there a task (script?) we can run to verify the linked server is up before we execute the job? We already run checks on connection managers to see if they are up, but we have never tried linked servers. Lastly, will performance be an issue?
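For the availability check, one sketch is an Execute SQL task that calls the (undocumented but long-standing) sp_testlinkedserver procedure inside TRY/CATCH before the rest of the job runs; the linked server name below is a placeholder:

DECLARE @LinkedServerUp bit;
SET @LinkedServerUp = 1;
BEGIN TRY
    EXEC sp_testlinkedserver N'MY2000SERVER';   -- raises an error if the linked server cannot be reached
END TRY
BEGIN CATCH
    SET @LinkedServerUp = 0;                    -- flag the failure instead of letting the job fail outright
END CATCH;
SELECT @LinkedServerUp AS LinkedServerUp;       -- map this single-row result to an SSIS variable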
Does anyone know about the undocumented options of DBCC? Which options are undocumented, e.g. LOG? There are only a few options which are documented in SQL Books Online. Thanks in advance.
What are the most critical "dbcc" options that should be run to ensure the sanity of the database and DBA alike? Do these have to be run in single-user mode, or can they be run while users are on the system?
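For reference, a hedged sketch of the consistency checks most shops schedule; none of them require single-user mode, though they do add I/O load while running (the database name is a placeholder):

DBCC CHECKDB ('MyDatabase') WITH NO_INFOMSGS, ALL_ERRORMSGS;  -- overall logical and physical consistency
DBCC CHECKALLOC ('MyDatabase');    -- allocation consistency
DBCC CHECKCATALOG ('MyDatabase');  -- system catalog consistency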
An automatic monthly delete has recently grown from 15 to 20 million rows. It is now filling my 70GB T-log completely. I don't have any space to expand the T-log. Do I have any options other than reducing the number of rows in the delete?
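One common approach, sketched below on the assumption that the delete can be split up, is to remove the rows in batches and back up the log between chunks so no single transaction has to hold 20 million rows; table, column, batch size, and paths are placeholders (DELETE TOP needs 2005 or later; on 2000 the same idea works with SET ROWCOUNT):

WHILE 1 = 1
BEGIN
    DELETE TOP (100000)             -- batch size is arbitrary; tune it to your log capacity
    FROM dbo.BigTable
    WHERE CreatedDate < DATEADD(MONTH, -1, GETDATE());

    IF @@ROWCOUNT = 0 BREAK;        -- nothing left to delete

    BACKUP LOG MyDatabase TO DISK = 'E:\Backups\MyDatabase_log.trn';  -- keeps the log from filling (FULL recovery)
END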
hey guys, i need your help please. here is the scenario:
1. I need to return data back to the client (the result set varies from 20 to 10,000 rows)
2. I only want to show 20 records at a time
3. To get the info I need to display, I need to join 10 tables
When there are small numbers of records it works, but when I get over 8,000 it becomes a problem:
1. The first version was: get all the data using one big query, return everything back to the client, and display only 20 at a time (not very efficient). Takes around 15 seconds to view 20 records.
2. Inspired by 4GuysFromRolla (http://www.4guysfromrolla.com/webtech/062899-1.shtml): use a stored procedure with server-side paging logic to get 20 records at a time. I had to pass every filter parameter and so on; the SP had to sort the result set and return only the 20 records I need to display. Takes around 5 seconds to view 20 records.
I still think it's slow. I know this is a very broad question, but is there any other way to do it, logically?
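If the back end is SQL Server 2005 or later, one sketch is to page inside the query with ROW_NUMBER() so only the requested 20 rows ever leave the server (all table and column names below are placeholders):

DECLARE @PageNumber int, @PageSize int;
SELECT @PageNumber = 3, @PageSize = 20;

WITH Ordered AS
(
    SELECT  o.OrderID,
            o.CustomerName,
            o.OrderDate,
            ROW_NUMBER() OVER (ORDER BY o.OrderDate DESC) AS rn
    FROM    dbo.Orders AS o          -- in practice this is where the 10-table join goes
)
SELECT  OrderID, CustomerName, OrderDate
FROM    Ordered
WHERE   rn BETWEEN (@PageNumber - 1) * @PageSize + 1
               AND @PageNumber * @PageSize;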
Hi folks, recently I've done a fresh installation of the operating system and SQL: Windows 2000 Server SP4 and SQL Server 2000 Enterprise SP3. I am using a domain user account for the SQL service startup, and it is a member of Domain Admins. I've manually added the user to the local Administrators group of the system, and obviously it also has the sysadmin server role.

The problem is: when I use Enterprise Manager and change any of the server settings, such as Priority Boost or a fixed memory allocation for SQL Server, nothing happens. The options dialog box closes without asking to restart the server, and the settings don't take effect when I restart SQL. However, if I change the settings using sp_configure with the same user, it works. I've assigned fixed memory to SQL, but the option "Reserve Physical Memory For SQL" won't work, and I couldn't find this option in sp_configure. Any ideas what has gone wrong?
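For reference, a sketch of the sp_configure equivalents (the values are illustrative only; I believe the 'set working set size' advanced option is what the "Reserve physical memory for SQL Server" checkbox maps to on SQL Server 2000, and it, like priority boost, needs a service restart to take effect):

EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'priority boost', 1;
EXEC sp_configure 'min server memory (MB)', 2048;   -- fixed memory: set min and max to the same value
EXEC sp_configure 'max server memory (MB)', 2048;
EXEC sp_configure 'set working set size', 1;        -- assumed equivalent of "Reserve physical memory"
RECONFIGURE;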
This follows on from a query I had a few days back (and for which I was promptly flamed! However, I've got skin like a rhinoceros, so here goes...)
I have a table - ProjectSite - that pulls information from two tables (Project and Site). This table contains data regarding which sites are part of which project.
I now want a means of reporting dates against this. The problem is that each project has bespoke milestone dates, so I can't just create columns in ProjectSite. The only solution I can see is to pull each project (and there are quite a few) and its corresponding sites into a new table, and then I can create my bespoke columns.
Does this sound like the best viable option, or can anyone suggest another means of doing this?
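Another option worth weighing (a sketch only, with placeholder names) is to keep milestones as rows rather than bespoke columns, so each project can define its own milestone names without any new tables or columns per project:

CREATE TABLE dbo.ProjectSiteMilestone
(
    ProjectID     int          NOT NULL,
    SiteID        int          NOT NULL,
    MilestoneName varchar(100) NOT NULL,   -- e.g. 'Site survey complete', 'Handover'
    MilestoneDate datetime     NULL,
    CONSTRAINT PK_ProjectSiteMilestone PRIMARY KEY (ProjectID, SiteID, MilestoneName)
);

Reports can then filter by MilestoneName, or pivot the rows into columns per project if a crosstab layout is needed.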
I am considering the different options for package deployment on the server. Until now, I have found several different ways to deploy packages to the server (File System):
1. Using the Import option from Management Studio (only one by one).
2. Using the Deployment Utility (needs building the whole project; opens all the packages in debugging mode; cannot deploy to different folders).
3. Using dtutil by constructing a command line for each package deployment (complicated).
4. Simply copying the files from the local project folder to the "Program Files\Microsoft SQL Server\90\DTS\Packages" folder on the server.
Does anyone have any other suggestions for deployment? The 4th seems to be the easiest one, but I haven't seen anybody suggest such an action. What's the downside of it?
I hope someone can point me in the right direction here.
I have an application with the following requirements (using SQL CE 2 alas)
A set of tables on the server that need to be imported to the handheld. Using rda, I need to get the modifications to these tables from the server (add/edit/delete) but the handheld will never update these tables.
A set of tables on the server that need to be imported to the handheld. The handheld needs to add/edit existing records, and it needs to get any changes from the server.
A set of tables on the server where the handheld needs to import a subset of the records. It needs to add (but not edit) new records, upload the new records to the server, and download any changed (add/edit/delete) records to the handheld. What tracking options should I use in these 3 cases?
The problem comes in that I need to have some foreign key relationships in the database on the handheld. Since rda munges the names of primary keys (and indexes), I do not know of a good way to add these foreign key constraints. Any suggestions?
Hello all. I am currently doing some research into options for setting up reporting. Right now we have a server on EE that's getting hit a bit too hard by the reports. The budget is currently a bit low, but we already have a second server purchased.
For our reporting, we need data that is up to date within the last 15 minutes (less if possible). Because of the potential size of some of our transactions, I've ruled out log shipping as being too much downtime of the reporting data while the second server is catching up to speed. So, I'm trying to figure out what reporting options I have left open to me.
1) I understand that for reporting purposes, a snapshot must be taken of the mirrored database. Why can't reports run directly off the live mirror (or am I mistaken)? (A snapshot example is sketched after these questions.)
2) Is it possible to mirror from EE to SE (remember, low budget for the second server)?
3) How high is the overhead when doing a snapshot every 5-15 minutes? (I would think it's machine-specific, but overall is it pretty quick, or prohibitive based on how often the snapshot would be needed?)
4) Is replication perhaps a better option based on how up to date the data has to be? Are there any other options that may be available for near-realtime reporting?
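Regarding (1) and (3): the mirror itself sits in a restoring state and can't be queried directly, which is why a snapshot is needed. A minimal sketch of creating one (database name, logical file name, and path are placeholders, and the database snapshot feature requires Enterprise Edition) looks like this:

CREATE DATABASE SalesDB_Report_0915 ON
(
    NAME = SalesDB_Data,                              -- logical data file name of the source database
    FILENAME = 'D:\Snapshots\SalesDB_Report_0915.ss'  -- sparse file holding the snapshot pages
)
AS SNAPSHOT OF SalesDB;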
I have been attempting to locate a hosting company that offers SQL Server 2005 in addition to Analysis & Reporting Services but have been unable to find a hosting company which does so without purchasing the entire server. Anyone know of a company?
I have got a small project that requires feeding in a .CSV flat file and loading the data into SQL Server 2000.
I developed an SSIS package for this and got it working on my computer, but I need to deploy it to customers who don't have VS 2005 or SSIS installed. Can any one of you give me some clues on that?
I played around with the flat file deployment, and again it seems to work only on my computer, as I have everything installed.
Just wanted to get some feedback on this scenario. I will be developing an ASP.NET application for our local intranet that employees will have access to. It may also be implemented to allow employees to access this ASP site from home as well. I would be using SQL Server 2000 as the backend DB and wondered what type of licensing would best fit this scenario?
Sorry if this is double posted....seemed to be having some issues with posting.
Hi, I was recently experiencing slowness when executing stored procedures from a .NET application, but they ran fast when executed from Query Analyzer. Research led me to find that turning ArithAbort ON forces SQL Server to use the same execution plan whether the request is coming from Query Analyzer or the application. My concern now is the effect of ArithAbort. I understand what turning this option on does, but I am trying to think of a scenario where turning it on could be bad. Does anyone have any suggestions on what I should be aware of when disabling/enabling ArithAbort or ArithIgnore? Thanks. -Brian
Hello, I am trying to bind a SqlDataSource control to the GridView. I have selected the SqlDataSource control and specified the connection string; on the Configure Select Statement page, under advanced options, both the check boxes "Generate INSERT, UPDATE and DELETE statements" and "Use optimistic concurrency" are disabled for me. I have a proper SQL Server database, not an Express database. How do I get it to generate the InsertCommand, EditCommand, etc.? Any help would be great... relatively new here, thanks.
I'm writing an insert form which will write records to a few tables. What I want to know is: how do I write multiple answers to one question into different rows in the table while keeping the same ID?
For example.
The form has the following fields:
HotelID
HotelFacilities (CheckBoxList)
Now each hotel (in this case) will only have one ID but more than one HotelFacility .
How do I get my table to read...
HotelID | HotelFacility
1       | Bar
1       | Restaurant
1       | Cafe
1       | Wi-Fi Access
I presume INSERT INTO tblHotelFacilities(HotelID, HotelFacility) VALUES(@HotelID, @HotelFacility) won't write more than one selected facility? Thanks, Brett
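It won't on its own; you need one row per checked facility. A hedged sketch, assuming the page concatenates the checked items into a comma-separated string before calling the database (the values and variable names are placeholders; the simpler alternative is just to execute the single-row INSERT once per checked item from the page code):

DECLARE @HotelID int, @Facilities varchar(500), @Pos int, @Item varchar(100);
SELECT @HotelID = 1, @Facilities = 'Bar,Restaurant,Cafe,Wi-Fi Access';   -- illustrative values

WHILE LEN(@Facilities) > 0
BEGIN
    SET @Pos = CHARINDEX(',', @Facilities);
    IF @Pos = 0 SET @Pos = LEN(@Facilities) + 1;          -- last (or only) item left in the list
    SET @Item = LEFT(@Facilities, @Pos - 1);

    INSERT INTO tblHotelFacilities (HotelID, HotelFacility)
    VALUES (@HotelID, @Item);

    SET @Facilities = SUBSTRING(@Facilities, @Pos + 1, LEN(@Facilities));
END;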
The advanced SQL generation options (generate INSERT, UPDATE, and DELETE statements) are all greyed out in my SqlDataSource control. I have made a brand new instance with SQL Server Management Studio Express... have I missed something?
We have an intranet site (.NET 2.0) that needs access to a SQL Server 2005 instance. We've created a named user on the server, WebUser, and use its credentials in the connection string to connect to the server. Due to compliance issues, we're being asked to remove all the named users from the server. Now the users will have to connect to the database using their own security context. That means all the users need to be in some server role, should be given access to the server, etc., which is a security risk and a maintenance nightmare. What other alternatives do we have to solve this problem? Your help is much appreciated.
Is it possible to leave this parameter (max server memory (MB)) at its default (2147483647) if on my cluster I have Oracle, Lotus Notes, and some other things running, or do I have to calculate the amount of memory SQL Server needs? Thanks for any suggestions!
I need to know if there are any other failover options available for SQL Server/NT, beyond Microsoft Clustered Services. If there are any, which ones are advised?
Hi, often we come across a form with a dropdown list (populated from a lookup table) that has an "Other" option as well. When the user selects the "Other" option, a textbox appears where he can specify a new option (this option is not to be added to the master lookup table).
What is the best way to store this in the database? I have two columns (A and B). 'A' contains the lookup ID if some option other than "Other" is selected; otherwise 'A' is NULL and the 'B' field contains the text of the textbox.
But while reporting I face problems, because in a report the selected option text (either 'A' or 'B') has to be shown under one column only, while SELECT A, B results in two columns.
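One sketch, with placeholder table and column names, is to resolve the lookup and collapse the two values into a single column with COALESCE:

SELECT  COALESCE(lu.OptionText, t.B) AS SelectedOption   -- lookup text when A is set, free text otherwise
FROM    dbo.MyTable AS t
        LEFT JOIN dbo.LookupTable AS lu ON lu.LookupID = t.A;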
My company uses MS SQL Server for the back end and a retail-specific CRM as the front end. I wish to develop some internal pieces of software for our use. I was planning on doing this with Access.
My options are:
* Use Access as front end and back end
* Use Access as front end and SQL Server as back end (create a new DB)
* Use another front end and SQL Server as back end
My question is, what are some good front ends that are available for relatively small demands? How does Visual Studio come into play?
* Also, I would prefer to be able to create a .exe. I don't think Access allows that. I would not want users to be able to go into (or even see) the tables and queries. They should only be able to see the one main menu form at the very least.