SQL Server Admin 2014 :: Allowing Users To Run SSIS Packages Remotely
Mar 12, 2015
I am quite new to SSIS but managed to build a package which imports text files into SQL Server. The text files are generated after users complete a manufacturing process on a machine.
The SSIS package is stored in the SSIS catalog, and currently a SQL Agent job runs every evening to import new files that have been created during the day. Users have now requested the ability to run the import process as soon as they have finished their manufacturing runs, as they may want to query the data to look up stats, etc.
What is the best way to do this, considering the users are not SQL guys and won't have direct logins into the SQL Server or access to SQL Server Management Studio? They will have access to the PC where the files are generated, so ideally I need a batch file which they can just execute to import their new files.
I have seen lots of things on the web about running dtexec, but as the package is stored in the SSIS catalog, how can I execute it remotely?
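One possible approach (a sketch only; the server, folder, project, and package names below are placeholders) is a batch file that uses sqlcmd to call the SSISDB catalog stored procedures, which works for catalog-deployed packages without dtexec:

    -- save as run_import.sql and call from a batch file, e.g.:
    --   sqlcmd -S MyServer -E -i run_import.sql
    -- (the users' Windows logins need execute permission on the catalog folder)
    DECLARE @execution_id bigint;

    -- create an execution instance for the catalog package
    EXEC SSISDB.catalog.create_execution
        @folder_name  = N'ImportFolder',      -- placeholder folder name
        @project_name = N'ImportProject',     -- placeholder project name
        @package_name = N'ImportFiles.dtsx',  -- placeholder package name
        @execution_id = @execution_id OUTPUT;

    -- start the execution (runs asynchronously on the server)
    EXEC SSISDB.catalog.start_execution @execution_id;

Note that start_execution returns immediately; the import runs server-side, so the batch file finishes before the load necessarily completes.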
We manage some SSIS servers which have only SSIS and the SSIS tools installed on them, not the SQL Server database engine.
SSIS packages and configuration files are deployed on a NAS. We run the SSIS packages through dtexec by logging in to the server.
We want to allow developers to run their packages on the server on their own, but at the same time we don't want to give them physical access to the server, i.e. we do not want to add them to the RDP users list in the server properties. We want to allow them to run their packages remotely on the server.
One way we could think of is PowerShell remoting, and we are working on that. But is there any other way, or any tool already available for this?
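Another option worth sketching (job, domain, and login names below are placeholders) is to wrap each dtexec invocation in a SQL Agent job and let developers start jobs remotely from their own workstations, so no RDP access is needed:

    -- one-time setup by an admin; requires the developer's login to
    -- already be a user in msdb
    USE msdb;
    EXEC sp_add_rolemember N'SQLAgentOperatorRole', N'DOMAIN\Developer';

    -- the developer then runs this from SSMS/sqlcmd on their own machine:
    EXEC msdb.dbo.sp_start_job N'Run package X';

One caveat with this design: SQLAgentOperatorRole lets members start any local job, so if developers must be limited to their own packages, SQLAgentUserRole (which restricts members to jobs they own) may be the better fit.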
I'm trying to find out what tables are being used in a database.
I don't want just the last user; I want every user and the dates.
I have a script that returns the last user, but that is not going to work. The following script returns the last action, but not all users and not the login name:

    WITH LastActivity (ObjectID, LastAction) AS
    (
        SELECT object_id, last_user_seek
        FROM sys.dm_db_index_usage_stats
        WHERE database_id = DB_ID()
    )
    SELECT OBJECT_NAME(ObjectID) AS TableName, LastAction
    FROM LastActivity;
Our development team wanted to create a database user for each application user in the application and use these for granular data access control, which at first sounded like a good idea, but our initial testing ran into some interesting results.
Our target user base was about 15 million users with an estimated 1% concurrency rate, and finding no MS documentation on an upper limit to the number of users a database can have, we began some load testing to see how the database performed. In the hundreds-of-thousands-of-users range, our test database had a hard time performing well even under light loads (and even without any concurrent connections).
When we purged the users and reverted to just a handful of service accounts, performance went back to "normal" under the same loads. I began to wonder if this is a situation where throwing more hardware at the problem would overcome the issue, or if there is a practical upper limit to the number of users a single database can handle well.
(There were of course other cons to this arrangement and I certainly was never going to expand the users tree in the object explorer for a database like this, but we thought it a solution worth investigating.)
What is the largest number of users any of you have had in a single database?
I have a requirement to delete all the orphaned users in our databases. The issue I am having is that when a database principal owns a schema in the DB, the user can't be dropped.
How do I transfer the schema to dbo when I am looping through multiple databases? This is what I have so far:

    DECLARE @is_read_only bit;

    SELECT @is_read_only = is_read_only
    FROM master.sys.databases
    WHERE name = 'test';  /* This should be a parameter value */

    IF @is_read_only = 0
    BEGIN
        DECLARE @SQL varchar(200);
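For the schema-ownership part, something along these lines could be spliced into the loop (a sketch; the schema name is a placeholder, and the real names per user can be pulled from sys.schemas):

    -- schemas owned by a given user can be listed with:
    --   SELECT name FROM sys.schemas WHERE principal_id = USER_ID('TheOrphan');
    DECLARE @SchemaName sysname = N'SomeSchema';  -- placeholder
    DECLARE @SQL nvarchar(400);

    -- transfer ownership of the schema to dbo so the user can be dropped
    SET @SQL = N'ALTER AUTHORIZATION ON SCHEMA::'
             + QUOTENAME(@SchemaName) + N' TO dbo;';
    EXEC sys.sp_executesql @SQL;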
Hi... I have several SSIS packages which have to be started as and when required. I want to give users that have no access to the SQL Server the ability to start some of these packages (ideally out of an ERP system or via a web interface). The first idea that came to mind was to create a web service on the SQL Server which calls dtexec with the given parameters, but I have no idea if this is the best solution, or if it is possible at all.
I have a job that runs an SSIS package. The job seems to run through the package successfully, but at the end it errors out saying "The binary code for the script is not found. ...". The script referred to is near the beginning of the package (not the very first step) and should already have run.
The thing is, I can manually run this package on the server or in Visual Studio without any problem. Also, this job has been run on a regular basis without any issues on our old SQL 2008. I'm migrating this to SQL 2014 on Amazon Cloud.
Together with this package are two other very similar ones. They all work fine. I just can't figure out what can be wrong with this one.
I'm not able to start the SSIS service on my laptop. It gives an error saying the SQL Server Integration Services 11.0 service on the local computer started and then stopped; some services stop automatically if they are not in use by other services or programs.
I am also not able to start SSDT. It gives this error:
Microsoft Visual Studio is unable to load this document. To design Integration Services packages in SSDT, SSDT has to be installed by one of these editions of SQL Server: Standard, Enterprise, Developer, or Evaluation.
I have installed the SQL Server 2012 Evaluation version on my local desktop.
We've recently upgraded to SQL Server 2014 and are now using SSIS integrated with Visual Studio. We have an SSIS project which contains about 20 packages which are nested in Sequence Containers and executed concurrently. These packages have been set up as project references.
The problem is that when I press the start button to run the packages, they all light up green reporting completion before the data has finished loading into the SQL database. If I press the stop button without waiting a sufficient length of time, then not all of the data gets loaded. i.e. a certain number of rows will be missing from some of the SQL tables.
If I click through to the individual package items and check the data flow progress while running, some of the data flows appear to hang at a certain number of rows without ever reaching completion. The number of rows indicated in the data flow is incorrect - i.e. it will count up to ~150,000 and stay there indefinitely in the running state, when in actual fact there are ~500,000 rows to load.
To clarify, the main package will show all items green and display the "Finished: Success" message in the log window, however when I drill through to certain packages in the set, they'll be stuck in the yellow running state, with no way of knowing whether they've actually completed or not.
My current workaround is to just wait a certain length of time before pressing the stop button. This bug doesn't seem to inhibit rows being loaded - it just incorrectly identifies the point when the load finishes, causing people to terminate the load prematurely.
This issue only occurs if I run the project from the main package container. If I execute the child packages individually, they correctly report the number of rows being loaded and light up green once complete.
I have SSIS 2012 Enterprise, use catalog deployment, and have more than 50 environment variables for connections to databases across my enterprise.
The problem is that when I go to configure the packages after deployment and pick the proper environment variables, they are not sorted, so I have to browse all the entries in order to find the proper one.
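As a workaround, the variables can at least be listed in sorted order straight from the catalog views (a sketch using the SSISDB catalog views; handy for finding the right entry before going back to the configure dialog):

    -- list all environment variables in the catalog, sorted by name
    SELECT e.name  AS environment_name,
           ev.name AS variable_name,
           ev.type
    FROM SSISDB.catalog.environment_variables AS ev
    JOIN SSISDB.catalog.environments AS e
        ON e.environment_id = ev.environment_id
    ORDER BY e.name, ev.name;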
I want to set up a database role so that users can use sp_readerrorlog through SSMS. It does a check on membership in the securityadmin role.
I have tested it and can see you can grant execute on xp_readerrorlog but the SSMS GUI uses sp_readerrorlog.
I thought I could create a user/certificate and add the signature to sp_readerrorlog but it's not permitted (likely because it's not a normal database object).
So the other solution is to add the users to the securityadmin role but then explicitly deny ALTER ANY LOGIN (best done with a custom server role in 2012+, but otherwise just manually in 2008). I tested this out and it works: I'm not able to alter any logins or increase my own permissions. I also checked what's reported from fn_my_permissions(null, null), and it shows minimal permissions, as I'd expect.
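For reference, a minimal sketch of the 2012+ custom-server-role variant described above (the role and login names are placeholders):

    -- custom server role that only exists to carry the DENY
    CREATE SERVER ROLE [no_login_changes];
    DENY ALTER ANY LOGIN TO [no_login_changes];

    -- grant error-log access via securityadmin, then neutralize
    -- the dangerous permission through the custom role
    ALTER SERVER ROLE [securityadmin]    ADD MEMBER [DOMAIN\ErrorLogReader];
    ALTER SERVER ROLE [no_login_changes] ADD MEMBER [DOMAIN\ErrorLogReader];

Since DENY overrides the permissions inherited from securityadmin, members can still read the error log through SSMS but cannot create or alter logins.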
We recently moved our SQL Server 7.0 database from one machine to another, and I think the guys who did it didn't do things quite right...
None of the DTS packages work anymore... the little lines (for lack of a better term) that contain the SQL statements and such: if I try to edit their properties, I get an error that says this: Error Source: Microsoft OLE DB Provider for SQL Server
Error Description: [DBMSSOCN] General network error. Check your network documentation.
My guess is that these things are looking for Server X and now they're on Server Y. The problem is that when I try to edit the properties of the 'little line', that error pops up... and then the window that comes up (the one with the tabs across the top that lets me enter the SQL statement) disappears right away, and I can't make the changes?!?
Other question:
The SQL Server is now on a machine that's like 3000 miles away. Right now I use pcAnyWhere to log into it and use the SQL Enterprise Manager there.
Well, I've got SQL Server installed on this machine here... surely there is a way to have this SQL Enterprise Manager connect to the remote SQL Server so I can play with it that way instead of the pcAnyWhere mess... right? Seems logical...
I'm trying to set up SQL Server so that people with Enterprise Mgr can create a DB registration to their DB only (sql.yoursite.com). Are there any tutorials out there for doing this?
Developers want to give users the ability to schedule and run processes, which may be executables, T-SQL, DTS, Active Scripting, etc. I don't want to enable a proxy account due to the well documented security problems. Has anyone come up with a good way to do this?
I'm relatively new to administering SQL Server, hence this basic question.
What are the standard practice tool(s)/methods for allowing users access to a database to edit data? In this case, these are engineers (not programmers) that need to edit some values. The practice in the past has been to install Enterprise Manager on their PC. However, this does not seem proper to me and I was thinking more along the lines of having them use Access instead.
I see Enterprise Manager in SQL Server as the equivalent of Enterprise Manager in Oracle: a tool only for use by database administrators. Is this a correct correlation, or is Enterprise Manager in SQL Server a tool that users (and programmers?) are commonly allowed to use? In the case of common users, I would think not.
I would like to allow a particular user to truncate a log file in a stored procedure that the user runs every day. At this moment the only personnel that can truncate the log file are personnel with sysadmin rights. Is there any way to do this in SQL Server 2005 without granting this user sysadmin rights (something we REALLY don't want to do)? Thanks for all your help in advance. Dave C.
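For what it's worth, one commonly suggested pattern (a sketch only; database, file, and user names are placeholders, and truncating the log breaks the log backup chain, so test carefully) is a procedure that runs with the owner's database-level permissions:

    USE MyDb;
    GO
    CREATE PROCEDURE dbo.usp_TruncateLog
    WITH EXECUTE AS OWNER   -- executes with dbo's permissions, not the caller's
    AS
    BEGIN
        -- SQL 2005 syntax; WITH TRUNCATE_ONLY was removed in SQL 2008
        BACKUP LOG MyDb WITH TRUNCATE_ONLY;
        -- shrink the (now mostly empty) log file to roughly 100 MB
        DBCC SHRINKFILE (MyDb_log, 100);
    END;
    GO
    GRANT EXECUTE ON dbo.usp_TruncateLog TO [TheUser];

The user only needs EXECUTE on the procedure; BACKUP LOG and DBCC SHRINKFILE are database-level permissions covered by the dbo context, so no sysadmin grant is required.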
I'm building a package that will periodically be used to import data from an excel file to update a SQL database. I've done most of it but I am completely stuck on one bit:
How can I build something in that allows the user to browse for the Excel file to be imported (like you do when you set up the Excel Connection Manager and it asks which file)?
My ideal world would be someway of calling just that part of the Import wizard.
There are lots of things out there about dynamically choosing the file using variables, but they don't allow end-user interaction. It may be that I'm trying to do something I shouldn't...
Is this even possible in Reporting Services? I already deployed reports for our client using Reporting Services 2000; one of their complaints was that the drop-down list on the reports contains a very long list of data, and this list cannot be shortened since all the data is required. Is there a way to let the users type in several characters, not just one, to find the exact data they want? The data in the drop-down list is needed because these values are parameters.
Are there any web based reporting tools which can provide this kind of requirement?
I have an MSSQL Report Services Report Model set up to allow users to create their own ad hoc reports. The data source for my model is a named query that queries a MS SQL view, which actually pulls data from a series of tables and other views.

When you create a report from this model and attempt to filter the data, some of the fields will provide you with a pick list to select which values you would like to filter on, while other fields do not provide a pick list but require you to enter the filter values directly. Most of the fields I am trying to filter on are varchar fields, and like I said, some will create a pick list for you to select from and others will not.

This all seemed to start after I changed my data source to a named query rather than using the MS SQL view directly. I did this because it seemed that any time I had to make a change to the views the data source pulled from, it would mess up any existing reports that had been created, and this does not happen if I use a named query. I have gone crazy trying to figure this one out, so any ideas would be greatly appreciated!
Will start supporting a mission-critical application soon and am looking for the best way to support it remotely, short of dragging a laptop with me everywhere. Was going to eval PocketDBA. Any other ideas/solutions?
If I install an instance with Windows-only authentication and then change it to Mixed Mode, then when I enable the sa login the password has already been set. What is the default? If it's generated, how secure is it? What algorithm is used to generate it?
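Whatever setup assigned (the documentation describes it as a random generated password when Windows Authentication mode is chosen), the usual advice is not to rely on it: set a fresh, strong password yourself before enabling sa, along these lines:

    -- replace the placeholder with a genuinely strong password
    ALTER LOGIN [sa] WITH PASSWORD = N'<strong password here>';
    ALTER LOGIN [sa] ENABLE;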
My SQL databases in SQL Server 2014 have the status "Suspect", as shown in SQL Server Management Studio. I can't restore the databases to a serviceable condition through standard procedures. I need to recover from the .mdf file.
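Before resorting to the raw .mdf, the usual last-resort sequence for a Suspect database is worth noting (a sketch; the database name is a placeholder, and REPAIR_ALLOW_DATA_LOSS can discard data, so take a file-level copy of the .mdf/.ldf first):

    -- make the suspect database accessible for repair
    ALTER DATABASE [MyDb] SET EMERGENCY;
    ALTER DATABASE [MyDb] SET SINGLE_USER WITH ROLLBACK IMMEDIATE;

    -- repair; may lose data, which is why a restore from backup is
    -- always preferable when one exists
    DBCC CHECKDB (N'MyDb', REPAIR_ALLOW_DATA_LOSS);

    ALTER DATABASE [MyDb] SET MULTI_USER;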
I am using a monitoring system where I can monitor a numeric SQL result, assuming the result is one field and one row. I would like to use this to monitor, say, the free available space or percentage on the master database. DBCC SQLPERF gives me several columns and a row for every database on the server.
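One way to reduce DBCC SQLPERF to a single value (a sketch; note that SQLPERF reports log space, so data-file free space would instead come from sys.database_files) is to capture its output into a temp table and select the one row and column the monitor needs:

    -- shape matches the output of DBCC SQLPERF(LOGSPACE)
    CREATE TABLE #LogSpace
    (
        DatabaseName    sysname,
        LogSizeMB       float,
        LogSpaceUsedPct float,
        Status          int
    );

    INSERT INTO #LogSpace
    EXEC ('DBCC SQLPERF(LOGSPACE)');

    -- single row, single column: suitable for the monitoring system
    SELECT LogSpaceUsedPct
    FROM #LogSpace
    WHERE DatabaseName = N'master';

    DROP TABLE #LogSpace;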
In our environment, applications use a DNS name which points to the physical server's IP address. Now we are planning to move to 2014. We are planning to have servers in different subnets, so we will have two IP addresses for the listener. How can we point the DNS name to the listener IPs? If a failover happens, can DNS point to the exact listener IP address for the current primary node?
"Process 0:0:0 (0x1e10) Worker 0x00000006B6D341A0 appears to be non-yielding on Scheduler 13. Thread creation time: 12906028806348. Approx Thread CPU Used: kernel 0 ms, user 0 ms. Process Utilization 13%. System Idle 84%. Interval: 70189 ms."
Is it better to run Profiler or performance counters?
What filters do we have to select in Profiler to monitor the SQL Server?