Data Access :: Automating A Query
Jun 2, 2015
I have a SQL query that I would like to run every morning and output to a CSV file. Unfortunately, I can't seem to find simple instructions on how to use SQL Server Agent to do that.
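Not part of the original thread, but a rough sketch of one way to do this: a SQL Server Agent job step (or a scheduled task) can shell out to bcp and write the query results to a CSV file. The database, view and output path below are placeholders, and xp_cmdshell has to be enabled by an administrator:

-- Run as the command of a daily SQL Server Agent job step (all names are examples)
DECLARE @cmd varchar(1000);
SET @cmd = 'bcp "SELECT * FROM MyDatabase.dbo.MyMorningQuery" queryout '
         + '"C:\Exports\MorningReport.csv" -c -t, -T -S ' + @@SERVERNAME;
EXEC master..xp_cmdshell @cmd;

The same command works from sqlcmd or a Windows scheduled task if Agent is not available (for example on Express editions).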
We have an application that requires write access to the ReportServer virtual directory for the IUSR account when anonymous access is turned on during the install. Once the install is complete, we lock down the IUSR account so that it only has browse access to the virtual directory while anonymous access is enabled.
We automate the uninstall and install of our daily builds, and I'm trying to figure out whether I can automate this process as well, either through a command-line utility or in VBScript.
Your help is greatly appreciated...
Eric
We're trying to put a view of data maintained in desktop Access databases online and into SQL Server. The desktop Access system uses separate databases instead of tables within one database. It's a strange design, but it can't be changed. We have been importing all of the separate databases into a single, new Access database, then upsizing the new database to SQL Server, then uploading it. This is not going to work long term, because we are stuck with a 250 MB Access database to upsize and upload, when we never need to update more than 2 or 3 of the tables or upload more than 2 MB. We'd like to be able to upload only the tables -- preferably only the Access *.mdb's -- that have changed, and then replace the corresponding SQL Server tables with the new information. And we'd like to automate it as much as possible, without any Upsizing Wizardry. I don't know where to even begin looking for information about how this might be done. Any suggestions would be deeply appreciated.
- Tinker
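Not from the thread, but one possible sketch: if the Jet OLE DB provider is available on the SQL Server machine and ad hoc distributed queries are enabled, a changed table can be pulled straight out of its .mdb and swapped into SQL Server, skipping the combined Access database. The path and table name below are invented:

-- Replace one SQL Server table from one uploaded .mdb (all names are examples)
IF OBJECT_ID('dbo.Customers', 'U') IS NOT NULL
    DROP TABLE dbo.Customers;

SELECT *
INTO dbo.Customers
FROM OPENROWSET('Microsoft.Jet.OLEDB.4.0',
                'C:\Uploads\Customers.mdb';'Admin';'',
                'SELECT * FROM Customers');

Run once per changed table, this keeps each upload down to the few megabytes that actually changed.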
I am looking for the best method of automating an INSERT query in SQL Server 2005
Code:
--------------------------------------------------------------------------------
INSERT INTO SaltInvWhOpen (StockCode, Warehouse, TrnMonth, TrnYear, OpenBalQty, OpenBalCost)
SELECT StockCode, Warehouse, MONTH(GETDATE()) AS TrnMonth, YEAR(GETDATE()) AS TrnYear,
       OpenBalQty12, OpenBalQty12 * UnitCost AS Expr1
FROM InvWarehouse
--------------------------------------------------------------------------------
In older versions I would have used a DTS package (it's been a long time, and I'd have needed a book to guide me through it), but as DTS is gone now, I am unsure where to even begin looking.
If someone could assist by giving me the current method, and dare I ask, walk me through it, the help would be much appreciated.
Thank you!
Julia
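Not from the thread, but a minimal sketch of the post-DTS way to schedule the INSERT above: register it as a T-SQL step in a SQL Server Agent job with a daily schedule. The job name, run time and database name are made up:

USE msdb;
GO
EXEC dbo.sp_add_job         @job_name = N'Nightly SaltInvWhOpen load';
EXEC dbo.sp_add_jobstep     @job_name = N'Nightly SaltInvWhOpen load',
                            @step_name = N'Run INSERT',
                            @subsystem = N'TSQL',
                            @database_name = N'MyDatabase',   -- example database name
                            @command = N'INSERT INTO SaltInvWhOpen (StockCode, Warehouse, TrnMonth, TrnYear, OpenBalQty, OpenBalCost)
SELECT StockCode, Warehouse, MONTH(GETDATE()), YEAR(GETDATE()), OpenBalQty12, OpenBalQty12 * UnitCost
FROM InvWarehouse;';
EXEC dbo.sp_add_jobschedule @job_name = N'Nightly SaltInvWhOpen load',
                            @name = N'Daily at 2am',
                            @freq_type = 4,          -- daily
                            @freq_interval = 1,
                            @active_start_time = 020000;
EXEC dbo.sp_add_jobserver   @job_name = N'Nightly SaltInvWhOpen load';
GO

The same job can also be created graphically under SQL Server Agent > Jobs in Management Studio; SSIS only becomes necessary if the load grows beyond a single statement.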
A question: I've heard a bit about the Control Macro Generator but have not found much reading material on it. Here at our company we have employees fill out a weekly timesheet, and we typically have clerks enter the information into TM.PTA.00 (Project Timesheet with Rate/Amount Entry) and submit it that way. Say we have this timesheet in a nice little Excel sheet with columns that match the input boxes on the form; how can we get it to input the data automatically instead of having a person sit there and type it all in?
Hi,
I have written a data processing extension for my application and can deploy it on my development machine with no problems. My question is: what is the "correct" way of deploying an extension to an end user's machine? Do I have to write a special program to find and modify the Report Server configuration files and copy the extension over? Surely many developers have the same need, so there must be a generic solution to this problem; however, I haven't managed to find one.
I first came across this issue in SQL 2000 and I thought/hoped it would be rectified in SQL 2005, but it appears not to be (unless I'm missing something).
Any ideas would be greatly appreciated.
Thanks in advance,
Tim
Hi,
I am able to do these actions interactively from SQL 2005 (not Developer or Enterprise edition, just SQL Server 2005 Management Studio) and want to script/batch them so they run automatically at a preselected time.
First: I am able to delete the table by right-clicking it and choosing Delete in Management Studio. I verify the table is completely gone with a refresh. (I pulled the code that did this -- DROP TABLE etc. -- into Notepad.)
Second: I am able to import the table (again from Management Studio) and have saved this action as an SSIS package. I execute the package and, voila, I have all 17K rows of data. I pulled this CREATE TABLE code into Notepad as well.
Now I put the code of both of the above actions (drop table and create table) together into one SQL query and execute it. This does not give me the same result as above; instead, my table is now blank.
Maybe there is a better way. The business problem I am attempting to solve: I refresh the data in an AS/400 table weekly. I want that refreshed data to be available in the SQL 2005 database without my having to press buttons first thing Monday morning. Can anyone help? Thanks in advance.
Below is the Code:
USE [400kas]
GO
/****** Object: Table [dbo].[navar100] Script Date: 09/07/2007 16:09:04 ******/
DROP TABLE [dbo].[navar100]
GO
USE [400kas]
GO
/****** Object: Table [dbo].[Query] Script Date: 09/07/2007 16:12:31 ******/
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
CREATE TABLE [dbo].[navar100](
[CMPNO] [decimal](3, 0) NOT NULL,
[ARTDT] [datetime] NOT NULL,
[AUDDT] [datetime] NOT NULL,
[ARDDT] [datetime] NOT NULL,
[CCUS#] [decimal](6, 0) NOT NULL,
[CCNAM] [nvarchar](25) NOT NULL,
[CUSNO] [decimal](6, 0) NOT NULL,
[CNAME] [nvarchar](25) NOT NULL,
[SHPNO] [decimal](4, 0) NOT NULL,
[ARRCD] [nvarchar](1) NOT NULL,
[AUDUS] [nvarchar](10) NOT NULL,
[INVNO] [decimal](6, 0) NOT NULL,
[CUSPO] [nvarchar](15) NOT NULL,
[REFNO] [decimal](6, 0) NOT NULL,
[COMNT] [nvarchar](10) NOT NULL,
[SHPPO] [nvarchar](15) NOT NULL,
[AMONT] [decimal](13, 2) NOT NULL,
[AMOUNT] [decimal](24, 8) NOT NULL,
[REMAN] [decimal](13, 2) NOT NULL,
[INREG] [decimal](3, 0) NOT NULL,
[INSAL] [decimal](3, 0) NOT NULL,
[TMCOD] [nvarchar](2) NOT NULL,
[CRHLD] [nvarchar](1) NOT NULL,
[CRLIM] [decimal](13, 0) NOT NULL,
[CRDAY] [decimal](3, 0) NOT NULL,
[TCRCD] [nvarchar](3) NOT NULL,
[TEXRT] [decimal](11, 6) NOT NULL,
[R1RGL] [decimal](13, 2) NOT NULL,
[TAXAM] [decimal](13, 2) NOT NULL,
[TFRTX] [decimal](13, 3) NOT NULL,
[TFRGT] [decimal](13, 2) NOT NULL,
[TSPCH] [decimal](13, 2) NOT NULL,
[SPCST] [decimal](13, 2) NOT NULL,
[IRPFT] [decimal](13, 2) NOT NULL
) ON [PRIMARY]
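Worth noting: the script above only drops and re-creates the table; it contains no import or INSERT step, which is why the table comes back empty. Not from the thread, but one sketch that avoids the drop/re-create gap entirely (assuming a linked server to the AS/400, here called AS400, already exists and the column layouts match) is to keep the table and simply empty and reload it in a single scheduled step:

-- Weekly refresh of dbo.navar100 from the AS/400 (linked server and library names are examples)
USE [400kas];
GO
BEGIN TRANSACTION;
    TRUNCATE TABLE dbo.navar100;
    INSERT INTO dbo.navar100
    SELECT *
    FROM OPENQUERY(AS400, 'SELECT * FROM MYLIB.NAVAR100');
COMMIT TRANSACTION;

Scheduled as a weekly SQL Server Agent job, this removes the Monday-morning button pressing.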
We have the following scenario: We receive CSV files every month for which SSIS packages were built to process the data. The following problems occur from time to time:
1. The structure of the CSV file changed (e.g. column added or removed)
2. There were no footers in the data, but now footers started to appear
3. Date format changed (e.g. used to be mm/dd/yyyy, but became mm.dd.yyyy)
4. Number format changed (e.g. from 2000 to 2,000)
Currently we have a person who manually opens each file and, using our "validation document", checks that none of these or similar problems occur. We would like to move away from this manual process if possible. I understand that items 3 and 4 could be caught by loading the data into a staging table with VARCHAR data types and performing validation before moving it any further (see the sketch after the list of options below).
Item 2 is a bit questionable (meaning that, depending on the footer size, the SSIS load could fail or not).
Item 1, however, is a sure failure for an SSIS package that loads the data directly into a table.
Thus I feel the two possible options are:
1. Create a custom script that will run through the file, row by row, apply all the necessary validations and report an error or continue if all checks out
2. Use some 3rd party tool to validate the files (semi-manually) before kicking off the SSIS processing.
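As a rough illustration of the staging-table idea for items 3 and 4 (a sketch only; the table and column names are invented), the file can be bulk-loaded into an all-VARCHAR staging table and the suspect rows flagged before any conversion:

-- Staging table with VARCHAR columns (structure is an example)
CREATE TABLE dbo.StagingMonthlyFile (
    RecordDate varchar(50),
    Amount     varchar(50)
);

-- After loading the CSV, flag rows that signal a format change
SELECT *
FROM dbo.StagingMonthlyFile
WHERE ISDATE(RecordDate) = 0      -- catches a changed date format (e.g. mm.dd.yyyy)
   OR ISNUMERIC(Amount) = 0       -- catches values that are not numbers at all
   OR Amount LIKE '%,%';          -- catches a changed number format (e.g. 2,000)

A column-count or footer change (items 1 and 2) is easier to catch before the data flow, for example by reading just the first and last lines of the file in a script task and comparing the delimiter count against the expected layout.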
Is there a script anyone has that will automate the addition of an Access database to the ODBC data sources in Control Panel? Thanks
Hi, hope someone can help or point me down the right track.
I have to load 50+ tables (all with different file layouts) using a DataReader source (the source only allows an ODBC connection) into a SQL Server 2005 database. Rather than create 50+ ETL packages that are identical in process terms, is it possible to create a single package that dynamically re-maps the source-to-destination column mappings? I know I can set the source and destination using expressions, but how do I ensure that the mappings are updated, and can I automate the error rows so that they are redirected? Is this possible using a script task, or by some other means?
Many thanks for any help
Hello,
I will be getting data in either Excel or Access form on a daily basis. I would like to automate the process of converting this (Excel or Access) data into a table in an existing SQL database. Since this conversion needs to be performed daily, note that I need to update the table that contains the previous day's data.
Is it possible to do this and if it is possible, can someone tell me how to do it.
Thanks in advance,
Joe green
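Not from the thread, but one possible shape for the daily refresh above (a sketch, assuming the incoming file is an Excel workbook at a fixed path with a header row; all names are placeholders) is to read the sheet with OPENROWSET and replace the previous day's rows:

-- Daily reload of dbo.DailyFigures from an Excel workbook (path, sheet and table names are examples)
BEGIN TRANSACTION;
    DELETE FROM dbo.DailyFigures;     -- or delete only the previous day's rows if history must be kept
    INSERT INTO dbo.DailyFigures
    SELECT *
    FROM OPENROWSET('Microsoft.Jet.OLEDB.4.0',
                    'Excel 8.0;Database=C:\Incoming\Daily.xls;HDR=YES',
                    'SELECT * FROM [Sheet1$]');
COMMIT TRANSACTION;

Wrapped in a SQL Server Agent job or a scheduled task, this runs unattended; for the Access variant, the provider string would point at the .mdb instead of the workbook.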
I would like to know recommendations on automate the following:
1. I want to know how to take data from various SQL Server 2000 queries and load it into Excel spreadsheets in an automated way. Some of the spreadsheets are loaded with detail data and some load data into pivot tables.
Is there any way that the data can be taken from SQL Server to the Excel spreadsheets using ODBC connections?
2. Some of the data taken from the SQL Server 2000 database is loaded into PDF files. Is there any way this process can be automated?
Thanks!
I have a view that gives me the data for all the batches. Now I am using a query on the view to get a single batch's data. When I use a direct query it takes 0 seconds, but when I go through the view ("select * from myView where batched=2") it takes 30 minutes.
Is it possible to attach Query Analyzer to an Access database to query data (and if so, how?), or must one upsize the MDB into a SQL Server database for querying?
Thanks,
Al
I am using a SQL SELECT query to get the MIN of column data, but I have 12 columns and want the MIN across these 6 columns in one SELECT statement.
I have written it as SELECT MIN(temp_sv, humidity_sv, SH1, SH2, SH3, SH4) FROM production, but it's not working. How can I do this?
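MIN() is an aggregate over rows, not over a list of columns, so it cannot take six arguments like that. One way to get a per-row minimum (a sketch against the production table from the question; works on SQL Server 2005 and later) is to turn the six columns into rows with CROSS APPLY:

SELECT p.*, m.min_val
FROM production AS p
CROSS APPLY (
    SELECT MIN(v) AS min_val
    FROM (SELECT p.temp_sv AS v
          UNION ALL SELECT p.humidity_sv
          UNION ALL SELECT p.SH1
          UNION ALL SELECT p.SH2
          UNION ALL SELECT p.SH3
          UNION ALL SELECT p.SH4) AS cols
) AS m;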
Hi Guys,
I am trying to automate a basic task using SQL Server 2005 Express.
Currently I have a query script that I run, and I then save the results as a CSV file. I need to do this on a daily basis, so I am looking to find out how best to go about it. There are a multitude of third-party tools that claim to be able to do this - can anyone recommend one, or enlighten me as to the best way to set up this automation?
All ideas gratefully received!
I want to know the SQL query to return the differences between two tables from two different databases.
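One simple pattern (a sketch; the database and table names are placeholders, and it assumes both tables have the same column layout and you are on SQL Server 2005 or later) is to compare the tables in both directions with EXCEPT:

-- Rows present in database A's copy but missing or different in database B's copy
SELECT * FROM DatabaseA.dbo.MyTable
EXCEPT
SELECT * FROM DatabaseB.dbo.MyTable;

-- And the reverse direction
SELECT * FROM DatabaseB.dbo.MyTable
EXCEPT
SELECT * FROM DatabaseA.dbo.MyTable;

On older versions, a LEFT JOIN on the key columns with an IS NULL test does the same job.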
I'm running the following SQL query from LabVIEW, a graphical programming language, using its built-in database connectivity:
DECLARE @currentID int
SET @currentID = (SELECT MIN(ExperimentID) FROM Jobs_t WHERE JobStatus = 'ToRun');
UPDATE [dbo].[Jobs_t]
SET [JobStatus] = 'Pending'
WHERE ExperimentID = @currentID;
SELECT @currentID AS result
<main.img>
This is analogous to main() in a C-like language. The first block, which has the "Connection Information" wire going into it, opens a .udl file and creates an ADO.NET _Connection reference, which is later used to invoke methods for the query.
<execute query.img>
This is the inside of the second block, the one with "EXE" and the pink wire going into it. The boxes with the gray border operate much like "switch" statements: the wire going into the "?" terminal on each box determines which case gets executed. The yellow boxes with white rectangles dropping down are invoke nodes and property nodes; they accept a reference to an object and allow you to invoke methods and read/write properties of that object. You can see the _Recordset object here as well.
<fetch recordset.img>
Here's the next block to be executed, the one whose icon reads "FETCH ALL". The first thing to execute, on the far left, grabs some properties of the recordset and returns them in a "struct" (the pink wire that goes into the box that reads "state"). This is where the code fails: the recordset opened in the previous VI (virtual instrument) has a status of "closed", and the purple variant (seen under "Read all the data available") comes back empty.
The rest of the code is fairly irrelevant, as it's just converting the received variant into usable data and freeing the recordset reference opened previously. My question is: why would the status of the recordset returned by the query be "closed"? I realize that recordsets are "closed" when the query returns no rows, but executing that query in SSMS returns good data. Also, executing the LabVIEW code does perform the UPDATE in the query, so I know that's not broken either.
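An educated guess, not from the thread: with NOCOUNT off, the batch above returns "rows affected" messages for the SET and UPDATE statements ahead of the final SELECT, and ADO-based clients can surface those as empty, closed recordsets that sit in front of the real one. Adding SET NOCOUNT ON at the top of the batch suppresses them, so the only recordset returned is the one from the SELECT:

SET NOCOUNT ON;   -- suppress the "rows affected" messages

DECLARE @currentID int
SET @currentID = (SELECT MIN(ExperimentID) FROM Jobs_t WHERE JobStatus = 'ToRun');
UPDATE [dbo].[Jobs_t]
SET [JobStatus] = 'Pending'
WHERE ExperimentID = @currentID;
SELECT @currentID AS result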
I have this query in a stored procedure. I will simplify it so that it shows the issue I am having.
I have two tables. One table has a PK called PRSubLineID; the other table has it as a foreign key.
I did a query like this using the IN statement:
SELECT PRSubLineID, PRSUBLINENumber, f2, f3
FROM PRSUBLINES PRSL
WHERE PRSL.PRSUBLINENumber LIKE '%test%'
AND PRSL.PRSubLineID NOT IN
(SELECT CSL.PRSUBLineID FROM CtrSubLines CSL)
This is supposed to select those items from the one table whose PRSubLineNumber contains the word "test" and whose primary key has not been used as a foreign key in the other table. However, I am getting an empty result set back.
But when I comment out the AND clause and end the query after '%test%', like this:
SELECT PRSubLineID, PRSUBLINENumber, f2, f3
FROM PRSUBLINES PRSL
WHERE PRSL.PRSUBLINENumber LIKE '%test%'
I get a result set containing three records whose PRSubLineIDs are 2384, 2385 and 2386.
But when I add this query to the query window:
Select * FROM CtrSubLines CSL WHERE CSL.PRSubLineID in (2384, 2385, 2386)
I also get an empty result set. So, hoping to decipher what was going on, I changed the initial query to look like this:
SELECT PRSubLineID, PRSUBLINENumber, f2, f3
FROM PRSUBLINES PRSL
WHERE PRSL.PRSUBLINENumber LIKE '%test%'
AND PRSL.PRSubLineID /*NOT*/ IN
(SELECT CSL.PRSUBLineID FROM CtrSubLines CSL)
And this still returned an empty result set.
Basically this says that my PRSubLineIDs are neither included in nor excluded from the set of all values of PRSubLineID in the CtrSubLines table. But that is simply impossible.
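It is not impossible if CtrSubLines.PRSUBLineID contains NULLs: NOT IN returns no rows at all when the subquery produces even one NULL (every comparison evaluates to UNKNOWN), while IN still needs an actual match to return a row, so both versions can legitimately come back empty. A NULL-safe rewrite of the original query (same tables and columns as above) uses NOT EXISTS:

SELECT PRSubLineID, PRSUBLINENumber, f2, f3
FROM PRSUBLINES PRSL
WHERE PRSL.PRSUBLINENumber LIKE '%test%'
  AND NOT EXISTS (SELECT 1
                  FROM CtrSubLines CSL
                  WHERE CSL.PRSUBLineID = PRSL.PRSubLineID);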
I'm trying to re-write my database to decouple the interface (MS Access) from the SQL back end. As a result, I'm going to write a number of stored procedures to replace the MS Access code. My first attempt worked on a small sample; however, trying to move this onto a real table hasn't worked (I've amended the SP and code to try to get it to work on 2 fields, rather than the full 20-plus). It works in SQL Management Studio (supply a ClientID and it returns all the client details), but it does not return anything (the recordset is closed) when accessed via VBA code. The stored procedure is:
USE [VMSProd]
GO
/****** Object: StoredProcedure [Clients].[vms_Get_Specified_Client] Script Date: 22/09/2015 16:29:59 ******/
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
[code]....
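A common cause of a closed recordset in VBA/ADO when the same call works in Management Studio is the "rows affected" messages a procedure emits before its SELECT. A minimal sketch of the fix (the parameter and table names are assumptions, since the procedure body is truncated above) is to start the procedure with SET NOCOUNT ON:

ALTER PROCEDURE [Clients].[vms_Get_Specified_Client]
    @ClientID int                      -- parameter name assumed
AS
BEGIN
    SET NOCOUNT ON;                    -- stops ADO treating row-count messages as an empty first recordset
    SELECT *                           -- the original client SELECT goes here
    FROM Clients.ClientDetails         -- hypothetical table name
    WHERE ClientID = @ClientID;
END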
I have a client who has SSMS installed on her laptop. She is able to connect to the SQL server via SSMS in the office and query data on the server.
She needs to be off site often and doesn't have internet access. She asks if the data tables can be "backed up" or saved on her laptop, so she can look at them without worrying about connecting to the server. I am not sure if this can be achieved, as SSMS is built for accessing a server, not a desktop. I myself have never had this need. If I really needed it, I would go to Microsoft Access and create an ODBC connection to the data tables. But this client thinks that Microsoft Access is beneath her.
How to: import data to MS SQL 2008 from a password-protected Access DB?
Hi,
I have an application that uses following code:
Code Snippet
using System;
using System.Collections.Generic;
using System.Text;
using System.Data;
using System.Data.OleDb;
using System.Collections;
namespace TimeTracking.DB
{
public class sql
{
OleDbConnection conn;
//
//the constructor for this class, set the connectionstring
//
public sql()
{
DBConnectionstring ConnectToDB = new DBConnectionstring();
conn = ConnectToDB.MyConnection();
}
//
//
//
public void UpdateEntry(int ID, string Week, string Year, string Date, string Project, string Action, string Time, string Comment)
{
int m_ID = ID;
int m_Week = (Convert.ToInt32(Week));
int m_Year = (Convert.ToInt32(Year));
string m_Date = Date;
string m_Project = Project;
int m_ProjectID = new int();
string m_Action = Action;
int m_ActionID = new int();
Single m_Time = (Convert.ToSingle(Time));
string m_Comment = Comment;
//
//get the project ID from the database and store it in m_ProjectID
//
OleDbCommand SelectProjectID = new OleDbCommand("SELECT tblProject.ProjectID FROM tblProject"
+ " WHERE (((tblProject.Project) LIKE @Project))", conn);
SelectProjectID.Parameters.AddWithValue("@Project", m_Project);
try
{
//open the connection
conn.Open();
OleDbDataReader Dataset = SelectProjectID.ExecuteReader();
while (Dataset.Read())
{
m_ProjectID = (int)Dataset["ProjectID"];
}
Dataset.Close();
}
//Some usual exception handling
catch (OleDbException e)
{
throw (e);
}
//
//get the action ID from the database and store it in m_ActionID
//
OleDbCommand SelectActionID = new OleDbCommand("SELECT tblAction.ActionID FROM tblAction"
+ " WHERE (((tblAction.Action) LIKE @Action))", conn);
SelectActionID.Parameters.AddWithValue("@Action", m_Action);
try
{
OleDbDataReader Dataset = SelectActionID.ExecuteReader();
while (Dataset.Read())
{
m_ActionID = (int)Dataset["ActionID"];
}
Dataset.Close();
}
//Some usual exception handling
catch (OleDbException e)
{
throw (e);
}
//
//
//
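// Note (added observation, not from the original post): the OLE DB .NET provider binds parameters
// by position, not by name, so the AddWithValue calls below must follow the order in which the
// placeholders appear in the command text (@Week, @Year, @Date, @ProjectID, @ActionID, @Time,
// @Comment, and @ID last). Adding @ID first shifts every value by one parameter and is a likely
// reason the UPDATE silently matches no rows.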
OleDbCommand Query = new OleDbCommand("UPDATE [tblEntry] SET [tblEntry].[Weeknumber] = @Week,"
+ " [tblEntry].[Year] = @Year, [tblEntry].[Date] = @Date, [tblEntry].[Project] = @ProjectID, [tblEntry].[Action] = @ActionID,"
+ " [tblEntry].[Hours Spent] = @Time, [tblEntry].[Comments] = @Comment WHERE (([tblEntry].[ID]) = @ID)", conn);
Query.Parameters.AddWithValue("@ID", m_ID);
Query.Parameters.AddWithValue("@Week", m_Week);
Query.Parameters.AddWithValue("@Year", m_Year);
Query.Parameters.AddWithValue("@Date", m_Date);
Query.Parameters.AddWithValue("@ProjectID", m_ProjectID);
Query.Parameters.AddWithValue("@ActionID", m_ActionID);
Query.Parameters.AddWithValue("@Time", m_Time);
Query.Parameters.AddWithValue("@Comment", m_Comment);
try
{
Query.ExecuteNonQuery();
}
//Some usual exception handling
catch (OleDbException e)
{
throw (e);
}
finally
{
//close the connection
if (conn != null)
{
conn.Close();
}
}
}
}
}
Code Snippet
The UPDATE statement is not working in my application: no error in C# and no error in MS Access. When I paste the UPDATE query into the MS Access query tool and replace the parameter values (@...) with real values, it updates the record.
What am I overlooking here?
--Pascal
hi all,
Is there any query to move certain data from a SQL table to an Access table? I have a requirement where I have to fetch the records that fall within a specified range from a SQL table into an MS Access table. Is this possible through a query?
thanks
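Not from the thread, but one way to do this from the SQL Server side (a sketch only; the .mdb path, table names and range column are invented, and the Access table must already exist with matching columns) is an INSERT into OPENROWSET pointed at the Access file:

INSERT INTO OPENROWSET('Microsoft.Jet.OLEDB.4.0',
                       'C:\Data\Target.mdb';'Admin';'',
                       'SELECT OrderID, OrderDate, Amount FROM tblOrders')
SELECT OrderID, OrderDate, Amount
FROM dbo.Orders
WHERE OrderDate BETWEEN '2007-01-01' AND '2007-03-31';   -- the "specified range"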
Is there a way to specify an MS Access table (or query object) in the SELECT query of SQL Server?
Ex.:
MSAccessTable (in file.mdb):

col1   col2
----   ----
a1     a2
b1     b2
SQL query in SQL Server:
SELECT col1, col2 into SqlTable from [file.mdb].MSAccessTable;
Thanks,
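Something close to the pseudo-syntax above is possible (a sketch, assuming ad hoc distributed queries are enabled and the .mdb sits at the path shown) by referencing the Access file through OPENDATASOURCE as the first part of a four-part name:

SELECT col1, col2
INTO SqlTable
FROM OPENDATASOURCE('Microsoft.Jet.OLEDB.4.0',
                    'Data Source=C:\file.mdb')...MSAccessTable;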
I have recently upgraded to SQL 2014 on Windows 2012. The Access front-end program works fine.
But previously created Excel reports with built-in MS Queries now fail with the above error for users with MS 2013. The queries still work for users still using MS 2007.
I also cannot create any new queries, and I get the same error message. If I log on as myself on the domain to another PC with 2007 installed, it works fine, so I don't think it has anything to do with AD groups or permissions.
We need to insert data/rows from a SQL Server 2014 database into an MS Access database. The problem is, there are so many columns (100+) in the table, and there are so many insert transactions of this kind (from different tables), that it is not very easy to write VB.NET code that lists all the column names.
Both the Access and SQL Server tables have the same number of columns and equivalent data types, so the insert itself is not really the problem. It's just: is there a way to write an INSERT statement in T-SQL that does not name all the columns?
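T-SQL only requires a column list when you are not supplying a value for every column; if the two tables really do have the same columns in the same order, the list can be dropped entirely. A sketch (assuming a linked server to the Access file, here called ACCESS_LNK, and example table names):

-- Insert every column without naming them; column order and types must match on both sides
INSERT INTO ACCESS_LNK...tblCustomers
SELECT *
FROM dbo.Customers;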
Hello.
Hi guys,
I've been developing desktop client-server and web apps and have used Access and SQL Server Standard most of the time.
I'm looking into using SQL CE, and had a few questions that I can't seem to get a clear picture on:
- The documentation for CE says that it supports 256 simultaneous connections and offers isolation levels, transactions, locking, etc., with a 4 GB database. But most people say that CE is strictly a single-user database and should not be used as a database server.
Could CE be extended for use as a multi-user database server by creating a custom server, such as a .NET Remoting server hosted in a Windows Service (or any other custom host), where the CE database would run in-process with that server on one machine, which would then be accessed by multiple users from multiple machines?
Clients PCs -> Server PC hosting Remoting Service -> ADO.NET -> SQL CE
- And furthermore, can we use Enterprise Services (serviced components) to connect to SQL CE and extend this model further to offer a pure, high-quality database server?
Clients PCs -> Server PC hosting Remoting Service -> Enterprise Services -> ADO.NET -> SQL CE
Seems quite doable to me, but I may be wrong... please let me know either way.
Thanks,
CP
I am trying to create a process to automate the FTP transfer and then IMPORT to a table. Once the import is complete, I would like to move the file from the FTP server and rename it. I tested the commands via CMD and they work. I then created a stored procedure with the DOS commands to rename/move the source file. The command executes and deletes the file from the source; however, the files copied to a different destination disappeared. I did a search of my local hard drive (I am testing using my local folders), but I can't find where the copied/renamed file went.
Here is the portion of the commands:
SET @cmd = 'Copy "C:\Documents and Settings\egllare\My Documents\BoxtestData.txt" "C:\Documents and Settings\egllare\My Documents\ArchiveCopybox.%random%.bak"'
EXEC master..xp_cmdshell @cmd
GO
Can you please tell me what's wrong with this, or direct me to a sample stored procedure I can model?
TIA
Willie Llarena
Hi team
Could anyone tell me if and how I can automate processing in SQL Server? For example, I would like to automate reading from a source file, populating the respective tables, and then running a view to process the desired tables. I have already done all of the above manually. I just want to know if there is a way of automating this whole process.
When I try to start our hospital software, which is based on SQL Server 2000, it shows the following error: "Search Condition is not valid, (DBNETLIB) ConnectionOpen (Connect()). SQL Server does not exist or access denied. Due to Fetch data." I run the software on Windows 8.1, while it runs smoothly on previous versions of Windows, XP and 7.
I was wondering if anyone can share the procedure(s) used in setting up an automated MS SQL database backup through Tivoli Storage Manager (5.2.7)?