Transact SQL :: Can Store File Data Without Streaming On Server

Oct 14, 2015

I need to store file(s) to SQL without streaming / reading at the server. I have created a Web API with AngularJS and SQL, e.g.:

var fileType = httpRequest.Files[file].ContentType;
        var fileStrm = httpRequest.Files[file].InputStream;
        var fileSize = httpRequest.Files[file].ContentLength;
        byte[] fileRcrd = new byte[fileSize];
        var file_Name = Path.GetFileName(filePath);
        fileStrm.Read(fileRcrd, 0, fileSize);

Is it possible to send file data to SQL (in bytes) without streaming / reading at the server? I don't want to put a load on the server for large files, just read the data and send it to SQL, where SQL will do the streaming and store the data as varbinary.
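One possible direction (a sketch, not from the post): with FILESTREAM, SQL Server itself streams the bytes onto its own NTFS storage, and the Web API can write straight to it with SqlFileStream instead of buffering the whole upload in a byte array. The table and column names below are made up, and the sketch assumes FILESTREAM is enabled on the instance and that the database already has a FILESTREAM filegroup.

-- A minimal sketch of a FILESTREAM-backed table (all names are placeholders)
CREATE TABLE dbo.UploadedFiles
(
    FileId   uniqueidentifier ROWGUIDCOL NOT NULL UNIQUE DEFAULT NEWID(),
    FileName nvarchar(260)    NOT NULL,
    FileData varbinary(max)   FILESTREAM NULL
);

From there the client typically inserts a row, reads PathName() and GET_FILESTREAM_TRANSACTION_CONTEXT() for it, and streams the bytes through SqlFileStream, so the web server never holds the whole file in memory.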

View 4 Replies



Using Varbinary(max) And UPDATE .WRITE To Store Data In A Streaming Fashion

Nov 1, 2007

I have the following table:

CREATE TABLE [dbo].[IntegrationMessages]
(
[MessageId] [int] IDENTITY(1,1) NOT NULL,
[MessagePayload] [varbinary](max) NOT NULL,
)


I call the following InsertMessage stored proc from a C# class that reads a file.

ALTER PROCEDURE [dbo].[InsertMessage]
@MessageId int OUTPUT
AS
BEGIN


INSERT INTO [dbo].[IntegrationMessages]
( MessagePayload )
VALUES
( 0x0 )

SELECT @MessageId = @@IDENTITY

END


The C# class then opens a FileStream, reads bytes into a byte[], and calls the UpdateMessage stored proc in a while loop to chunk the data into the MessagePayload column.


ALTER PROCEDURE [dbo].[UpdateMessage]
@MessageId int
,@MessagePayload varbinary(max)

AS
BEGIN


UPDATE [dbo].[IntegrationMessages]
SET
MessagePayload.WRITE(@MessagePayload, NULL, 0)
WHERE
MessageId = @MessageId


END



My problem is that I always end up with a 0x0 value prepended to my data. So far I have not found a way to avoid this issue. I have tried making the MessagePayload column nullable, but .WRITE does not work with columns that are NULLABLE.

My column contains the following:
0x0043555354317C...
but it should really contain
0x43555354317C...


My goal is to be able to store an exact copy of the data I read from the file.

Here is my c# code:

public void TestMethod1()
{
    int bufferSize = 64;
    byte[] inBuffer = new byte[bufferSize];
    int bytesRead = 0;

    byte[] outBuffer;

    DBMessageLogger logger = new DBMessageLogger();

    FileStream streamCopy =
        new FileStream(@"C:vsProjectsSandboxBTSMessageLoggerInSACustomer3Rows.txt", FileMode.Open);

    try
    {
        while ((bytesRead = streamCopy.Read(inBuffer, 0, bufferSize)) != 0)
        {
            outBuffer = new byte[bytesRead];

            Array.Copy(inBuffer, outBuffer, bytesRead);

            string inText = Encoding.UTF8.GetString(outBuffer);

            Debug.WriteLine(inText);

            //This calls the UpdateMessage stored proc
            logger.LogMessageContentToDb(outBuffer);
        }
    }
    catch (Exception ex)
    {
        // Do not fail the pipeline if we cannot archive
        // Just log the failure to the event log
    }
}
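A likely cause and a possible fix (a guess, not confirmed in the thread): the initial INSERT seeds the column with the single byte 0x0, and .WRITE(@MessagePayload, NULL, 0) then appends after it, so that byte stays at the front. Seeding the row with a zero-length binary instead would let the appended chunks start at offset 0.

-- Sketch of a changed InsertMessage body: seed with an empty varbinary, not one byte
INSERT INTO [dbo].[IntegrationMessages]
( MessagePayload )
VALUES
( 0x )   -- 0x is a zero-length varbinary value; 0x0 is a single byte

SELECT @MessageId = @@IDENTITY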

View 7 Replies View Related

Help With Streaming Text File Saved As Image

Feb 8, 2008

Hello everyone.
I have an interesting problem. We have a SQL database with a field that is an image type. In it are records where the image field contains text files. What I would like to do is look at these files and stream them line by line, doing some processing for each line read.
So let's say a client machine uploaded a file called myfile.txt into this database.
I would like the ASP.NET (VB) application to take this file, read each line and do some processing.
I looked at MemoryStream and StreamReader but just can not figure this out. Can you please help? Thank you, Kameron
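One possible first step on the SQL side (a sketch; the table and column names are placeholders, not from the post): cast the image column to varchar(max) so the page gets back plain text, which can then be wrapped in a StringReader and read line by line.

-- Sketch: pull the stored text file back out of the image column as a string
DECLARE @FileId int;
SET @FileId = 1;   -- placeholder key

SELECT CAST(CAST(FileContents AS varbinary(max)) AS varchar(max)) AS FileText
FROM   dbo.StoredFiles
WHERE  FileId = @FileId;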

View 2 Replies View Related

ETL Architecture - Streaming Data Into An OLTP

Oct 15, 2007

Interested in feedback from the SQL grand wizards (and would-be wizards) that haunt these forums.

Let's say you need to constantly stream data into an OLTP system. We are talking multiple level hierarchies totaling upwards of 300 MB a day spread out not unlike a typical human sleep cycle (lower data during off-peak, still 24/7 requirements). All data originates from virtual machines running proprietary algorithms. The VM/data capture infrastructure needs to be massively scalable, meaning that incoming data is going to become more and more frequent and involve many different flat record formats.

The data has tremendous value when viewed both historically as well as in real-time (95% of real-time access will be read-only). The database infrastructure is in its infancy now and I'm trying to develop a growth plan that can meet the needs of the business as the data requirements grow. I have no doubt that the system will need to work with multiple terabytes of data within a year.

Current database environment is a single server composed of a Dell PowerEdge 2950 (Intel Quad Core 5355, 16 GB RAM, 2 x 73 GB 15K RPM SAS ) with an attached Dell PowerVault MD1000 (15 x 300 GB 10K RPM SAS in RAID 5+0 [2x7] w/hot spare) running Win 2k3 64-bit and SQL Server 2005 x64 Standard, 1-CPU.

I am interested in answering the following questions:

Based on the scaling requirements of the data capture and subsequent ETL, what transmission method would you find most favorable? For instance, we are weighing direct database writes via stored procedures for all VM systems versus establishing processes to collect, aggregate and stream CSVs into a specialized ETL environment running SSIS packages that load data and then call SQL stored procedures to scrub and prepare for production import. The data will require scrub routines that need access to current production data, so distributing the core data structures to multiple ETL processing systems would be expensive and undesirable.
Cost is very important to the overall solution design. In terms of database infrastructure, how would you maximize business value while keeping cost as low as possible? For instance, do you think there is more value in an ACTIVE/ACTIVE cluster (2 x CPU licenses) where one system acts as ETL and the other as OLTP, or would you favor replication of production data from ETL to OLTP (or vice versa)? With the second scenario, am I mistaken in thinking we could get away with a Server/CAL licensing model for the ETL server?
Are there any third party tools that I should research that would greatly aid me here?


I appreciate all feedback, criticism, and thoughts.

Best Regards,

Shane

View 5 Replies View Related

Transact SQL :: Format To Store Data In Table

May 10, 2015

I would like to know what is the preferred format for form submissions to a SQL table. 

View 5 Replies View Related

Store The Data From A Database Into A File.

Mar 22, 2008

How do I use the IDataReader interface's GetBytes(...) method to read binary data?
 
using System;
using System.Data;
using System.Configuration;
using System.Collections;
using System.Web;
using System.Web.Security;
using System.Web.UI;
using System.Web.UI.WebControls;
using System.Web.UI.WebControls.WebParts;
using System.Web.UI.HtmlControls;
using System.Data.SqlClient;
using System.Data.Common;
using System.IO;

public partial class uploadFileToDatabase : System.Web.UI.Page
{
    string strUser;

    protected void Page_Load(object sender, EventArgs e)
    {
        strUser = Request.QueryString["download"];
        if (strUser != null)
        {
            string strConn = "Data Source=hao-pc;Initial Catalog=simpleBBS;Persist Security Info=True;User ID=sa;Password=123;Pooling=False";
            SqlDataAdapter sda = new SqlDataAdapter();
            DbProviderFactory dbProviderFactory = DbProviderFactories.GetFactory("System.Data.SqlClient");
            DbConnection dbConn = dbProviderFactory.CreateConnection();
            dbConn.ConnectionString = strConn;
            dbConn.Open();
            DbCommand dbComm = dbProviderFactory.CreateCommand();
            dbComm.Connection = dbConn;
            dbComm.CommandText = "select * from userFiles2 where fileisn='" + strUser + "'";
            IDataReader myReader = dbComm.ExecuteReader();
            int strFileLen = 0;
            if (myReader.Read())
            {
                strFileLen = Convert.ToInt32(myReader.GetString(3));
                FileStream afs;
                string fPath;
                fPath = Server.MapPath(Request.ApplicationPath) + "\\upLoadFiles\\" + myReader.GetString(1);
                afs = new FileStream(fPath, FileMode.OpenOrCreate, FileAccess.Write);
                Byte[] filedata = new Byte[strFileLen];
                afs.Write(filedata, 0, strFileLen);
                BinaryWriter abw;
                int iBufferSize = 1000;
                Byte[] outbyte = new Byte[iBufferSize - 1];
                long retval;
                long starIndex = 0;
                abw = new BinaryWriter(afs);
                starIndex = 0;
                int itemNum = 4;
                // when i store large data into a file it displays an error such as
                // "Buffer offset '0' plus the bytes '82452' is greater than the length of the passed in buffer."
                retval = myReader.GetBytes(itemNum, starIndex, outbyte, 0, iBufferSize);
                while (retval == iBufferSize)   // loop condition assumed; this line was garbled in the post
                {
                    abw.Write(outbyte);
                    starIndex += iBufferSize;
                    retval = myReader.GetBytes(itemNum, starIndex, outbyte, 0, iBufferSize);
                }
                abw.Write(outbyte);
                abw.Flush();
                abw.Close();
                afs.Close();
                myReader.Close();
                dbComm.Dispose();
            }
        }
    }

    protected void Button1_Click(object sender, EventArgs e)
    {
        string strFileName;
        string strFileType;
        int strFileLen;
        Random ran = new Random();
        string strFileISN = DateTime.Now.ToString("yyyyMMddhhmmss") + ran.Next(0, 999);
        if (FileUpload1.PostedFile.FileName != null)
        {
            strFileName = FileUpload1.PostedFile.FileName;
            strFileType = strFileName.Substring(strFileName.LastIndexOf(".") + 1);
            strFileName = strFileName.Substring(strFileName.LastIndexOf("\\") + 1);
            strFileLen = FileUpload1.PostedFile.ContentLength;
            string strConn = "Data Source=hao-pc;Initial Catalog=simpleBBS;Persist Security Info=True;User ID=sa;Password=123;Pooling=False";
            SqlDataAdapter sda = new SqlDataAdapter();
            DbProviderFactory dbProviderFactory = DbProviderFactories.GetFactory("System.Data.SqlClient");
            DbConnection dbConn = dbProviderFactory.CreateConnection();
            dbConn.ConnectionString = strConn;
            dbConn.Open();
            DbCommand dbComm = dbProviderFactory.CreateCommand();
            dbComm.Connection = dbConn;
            string dateTime = DateTime.Now.ToString();
            Stream fs = FileUpload1.PostedFile.InputStream;
            Byte[] filedata = new Byte[strFileLen];
            fs.Read(filedata, 0, strFileLen);
            dbComm.CommandText = "insert into userFiles2 (fileisn,filename,filetype,filesize,filecontent,uploadTime) values (@fileisn,@filename,@filetype,@filesize,@filecontent,@uploadTime)";
            dbComm.Parameters.Add(new SqlParameter("@fileisn", (object)strFileISN));
            dbComm.Parameters.Add(new SqlParameter("@filename", (object)strFileName));
            dbComm.Parameters.Add(new SqlParameter("@filetype", (object)strFileType));
            dbComm.Parameters.Add(new SqlParameter("@filesize", (object)strFileLen.ToString()));
            dbComm.Parameters.Add(new SqlParameter("@filecontent", (object)filedata));
            dbComm.Parameters.Add(new SqlParameter("@uploadTime", (object)dateTime));
            dbComm.ExecuteNonQuery();
            fs.Close();
            dbConn.Close();
            dbComm.Dispose();
            dbConn.Close();
        }
    }
}
 

View 3 Replies View Related

Store Procedure To Load Data From Flat File To Staging Table Dynamically - Column Metadata

Apr 9, 2015

I have one stored procedure which is used to load data from a flat file into a staging table dynamically.

Everything is working fine. The staging_temp table has a single column, and all the data is stored in that single column. Below are sample rows.

AB¯ALBERTA ¯93¯AI
AI¯ALBERTA INDIRECT ¯94¯AI
AL¯ALABAMA ¯30¯

After that, the staging_temp data gets inserted into the main table. My problem is handling a file where the number of columns is more than in the actual table.

If you look at the sample rows, there are 4 columns separated by "¯", but I actually have only 3 columns in my main table. So how can I get only the first 3 columns from the staging_temp table?

Output should be like below.

AB¯ALBERTA ¯93
AI¯ALBERTA INDIRECT ¯94
AL¯ALABAMA ¯30

How can I achieve the above scenario?
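One way this is often done (a sketch; it assumes the single staging column is named [Data] and that every row contains at least three "¯" delimiters, neither of which is stated in the post): keep everything up to, but not including, the third delimiter.

-- Sketch: return only the first three delimited fields from each staging row
SELECT LEFT(Data,
            CHARINDEX('¯', Data,
                      CHARINDEX('¯', Data,
                                CHARINDEX('¯', Data) + 1) + 1) - 1) AS FirstThreeColumns
FROM dbo.staging_temp;

For the sample row AB¯ALBERTA ¯93¯AI this returns AB¯ALBERTA ¯93, matching the expected output.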

View 1 Replies View Related

How Can I Store A Text File Into SQL Server?

Apr 6, 2004

Hi,
Can some one give me some suggest?
Thanks!

View 2 Replies View Related

Transact SQL :: Why Does File Seem Displaced When Send Server Results To A File

Aug 24, 2015

I have the following SQL

IF OBJECT_ID('tempdb..#TempTable_Lockbox_File_Header_Record') IS NOT NULL
    DROP TABLE #TempTable_Lockbox_File_Header_Record;
IF OBJECT_ID('tempdb..#TempTable_Lockbox_Batch_Header_Record') IS NOT NULL
    DROP TABLE #TempTable_Lockbox_Batch_Header_Record;

[code]....

When I send my query results to a file in SQL Server Management Studio, how come I'm seeing the following in Notepad++? FH   TEST. "FH", which I thought should be in a CHAR(2) data column, is there, but "TEST" seems to start in column 6, not column 3 as I would have expected. I was expecting... FHTEST.

View 3 Replies View Related

Transact SQL :: Exporting Data To Excel File

Apr 24, 2015

I am curious what the "best practice" is for exporting data programmatically from SQL Server to Excel. Is it best to do it straight from SQL Server, or should I do it within my C# code? My program is going to pull the data, put it in the Excel file, then email the file. So I could write an SP that gets the data and puts it in the file, then have the C# code run the SP and email the file; or I could have the code do everything: pull the data, export it and email it.

If it is considered better to have the SP do it, why and what is the best way? ROWSET functions?
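For the pure T-SQL route, one commonly cited pattern is OPENROWSET against the Excel workbook. This is only a sketch: it assumes the ACE OLE DB provider is installed on the SQL Server, that 'Ad Hoc Distributed Queries' is enabled, and that C:\Exports\Report.xlsx already exists with a Sheet1 whose header row matches the selected columns (all of these are assumptions, and the object names are made up).

-- Sketch: push rows from a table into an existing Excel worksheet
INSERT INTO OPENROWSET('Microsoft.ACE.OLEDB.12.0',
                       'Excel 12.0;Database=C:\Exports\Report.xlsx;',
                       'SELECT CustomerId, CustomerName FROM [Sheet1$]')
SELECT CustomerId, CustomerName
FROM   dbo.Customers;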

View 4 Replies View Related

How To Read And Store A Large File From SQL Server 2000?

Feb 23, 2006

hi..
   I want to store an RMVB file in SQL Server 2000 and read it back; I want to play the RMVB file on the web. The size of the RMVB file is more than 300 MB and less than 1 GB, so the SQL image field can hold it.
Now my question is: how can I store and read the RMVB file from SQL Server 2000?
I used SqlInsertCommand.ExecuteNonQuery() in my program, but it is too slow and it raises an unknown error.
Thank you for your help.
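A sketch of the classic SQL Server 2000 pattern for very large image data (all names here are made up, and this is only an outline): insert a stub row, then append the file in chunks with UPDATETEXT so the whole 300 MB+ file is never sent in a single statement.

-- Sketch: table, a chunk-append proc, and the stub insert the client would do first
CREATE TABLE dbo.Videos
(
    VideoId   int IDENTITY(1,1) NOT NULL PRIMARY KEY,
    VideoName varchar(100) NOT NULL,
    VideoData image NOT NULL
)
GO

CREATE PROCEDURE dbo.AppendVideoChunk
    @VideoId int,
    @Chunk   image
AS
BEGIN
    DECLARE @ptr binary(16)
    SELECT @ptr = TEXTPTR(VideoData) FROM dbo.Videos WHERE VideoId = @VideoId
    UPDATETEXT dbo.Videos.VideoData @ptr NULL 0 @Chunk   -- NULL offset appends to the end
END
GO

-- Seed the row with a zero-length value, then call dbo.AppendVideoChunk in a loop
-- from the client with modest chunks read from the file.
INSERT INTO dbo.Videos (VideoName, VideoData) VALUES ('movie.rmvb', 0x)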

View 6 Replies View Related

Transact SQL :: Bulk Insert Arabic Data From CSV File To DB

Aug 5, 2015

USE TEST 
GO 
/****** BULK INSERT  ******/
BULK
INSERT [Table01]
FROM 'C:\temp\data.csv'

[code]....

I am using the above code to insert CSV file data, which consists of Arabic data as well. The upload is successful; however, the Arabic field data is uploaded with invalid characters, and I get the following error: Msg 4864, Level 16, State 1, Line 3... Bulk load data conversion error (type mismatch or invalid character for the specified codepage)
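A possible direction (a sketch, not a confirmed fix): the error usually means the file's encoding does not match the codepage BULK INSERT assumes. If the CSV is saved as Unicode (UTF-16), DATAFILETYPE = 'widechar' is the usual choice; if it is an ANSI file saved in the Arabic codepage, CODEPAGE = '1256' is the common alternative. The path below is the post's path with backslashes restored, and the terminators are assumptions.

BULK INSERT [Table01]
FROM 'C:\temp\data.csv'
WITH (DATAFILETYPE = 'widechar',   -- or: CODEPAGE = '1256' for an ANSI Arabic file
      FIELDTERMINATOR = ',',
      ROWTERMINATOR = '\n');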

View 15 Replies View Related

Transact SQL :: DB Audit File That Gets Populated With Data As Activity On DB Goes On

Aug 25, 2015

I am writing a T-SQL query for the following scenario: I have a SQL DB audit file that gets populated with data as activity on the DB goes on. I have multiple monthly tables set up, and the import should go into these monthly tables based on the event_time value in the SQL DB audit file. All the data with event_time values like '2015-08-25 15:59:39.033' should go to the SQL table Audit_tbl_Aug2015. Query for reading the SQL DB audit file:

SELECT * FROM sys.fn_get_audit_file ('C:\backup\Audit*', default, default)
ORDER BY event_time DESC
GO
-- DDL for the Audit table
CREATE TABLE [dbo].[Audit_tbl_Aug2015](
    [id] [bigint] IDENTITY(1,1) NOT NULL,
    [event_time] [datetime2](7) NOT NULL,
    [sequence_number] [int] NULL,
    [action_id] [varchar](4) NULL,
    [succeeded] [bit] NOT NULL,
    [permission_bitmask] [bigint] NOT NULL,
    [is_column_permission] [bit] NOT NULL,
    [session_id] [smallint] NOT NULL,
    [server_principal_id] [int] NULL,
    [database_principal_id] [int]

[code]...
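One way the monthly routing is often done (a sketch; it assumes the monthly table already exists with the columns shown in the DDL above, and the path is the post's path with backslashes restored): filter the audit function on event_time and insert the matching rows into that month's table.

-- Sketch: load only August 2015 events into the August table
INSERT INTO dbo.Audit_tbl_Aug2015
       (event_time, sequence_number, action_id, succeeded,
        permission_bitmask, is_column_permission, session_id,
        server_principal_id, database_principal_id)
SELECT  event_time, sequence_number, action_id, succeeded,
        permission_bitmask, is_column_permission, session_id,
        server_principal_id, database_principal_id
FROM    sys.fn_get_audit_file('C:\backup\Audit*', default, default)
WHERE   event_time >= '2015-08-01'
  AND   event_time <  '2015-09-01';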

View 2 Replies View Related

Transact SQL :: How To Export Data To CSV File On Scheduled Basis

Apr 17, 2011

I have an existing MS SQL database (2008 R2). I have a very simple SQL script. I need to run this script on a daily basis and save the results as a .csv file.

I've tried creating a job in SQL Manager - that didn't work. Then I found a tutorial that had me use MS Visual Studio... couldn't get that to function either.

What program should I be using, and how do I go about creating this as a task/job that can run on a scheduled basis?
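One common approach (a sketch only; every name and path here is made up): have a SQL Server Agent job run bcp with queryout to write the CSV. The bcp call can also go in a CmdExec job step directly; the xp_cmdshell form below assumes xp_cmdshell is enabled.

-- Sketch of a T-SQL job step that exports a query to CSV via bcp
DECLARE @cmd varchar(1000);
SET @cmd = 'bcp "SELECT col1, col2 FROM MyDb.dbo.MyView" queryout "C:\Exports\daily.csv" -c -t, -S MyServer -T';
EXEC master.dbo.xp_cmdshell @cmd;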

View 13 Replies View Related

Transact SQL :: Bulk Insert When Data File Is On Network Share

Dec 3, 2015

I am running Microsoft SQL Server 2012 SP on a Windows Server 2008 R2 Standard SP1 box. The SQL Server service is running as a simple windows domain user (nothing special, no admin rights, etc.) I am having some issues with using Bulk Insert when the data file is on a network share when using Windows Authentication. What is known is that the SQL Server service account has access to the network resource, which is shown by logging into SQL Server with a SQL account and doing the Bulk Insert. I also have rights to the files on the share, as shown by the fact that I put the files there. My SQL is in the form of:

Bulk Insert [table name] From '\\[server]\[share]\[filename]' With (FirstRow = 2, FormatFile='FormatFile.xml')

Now, when connecting to SQL Server with Windows Authentication and running the Bulk Insert I get the following error:

Msg 4861, Level 16, State 1, Line 2 Cannot bulk load because the file "\\[server]\[share]\[filename]" could not be opened. Operating system error code 5 (Access is denied.).

I found this snip at BULK INSERT (Transact-SQL), Security Account Delegation (Impersonation), which says, in part (emphasis mine):

To resolve this error [4861], use SQL Server Authentication and specify a SQL Server login that uses the security profile of the SQL Server process account, or configure Windows to enable security account delegation. For information about how to enable a user account to be trusted for delegation, see How to Configure the Server to be Trusted for Delegation.

We tried unconstrained delegation and I rebooted the SQL server, but it still does not work. Later we tried constrained delegation and it still does not work.

I have verified the SPNs:

C:\>setspn ad\svc_sql
Registered ServicePrincipalNames for CN=SVC_SQL,OU=Service Accounts,OU=Users,OU=ad domain,DC=ad,DC=local:
        MSSQLSvc/SQLQA.ad.local:1433
        MSSQLSvc/SQLDev.ad.local:1433
        MSSQLSvc/SQLQA.ad.local
        MSSQLSvc/SQLDev.ad.local
I have verified that my SQL connection is TCP and I am getting/using a Kerberos security token.
C:\>sqlcmd -S tcp:SQLQA.ad.local,1433 -E
1> Select dec.net_transport, dec.auth_scheme From sys.dm_exec_connections As dec Where session_id = @@Spid;
2> go
net_transport auth_scheme
------------- -----------
TCP           KERBEROS

(1 rows affected)
1>

If I move the source file to a local drive (on the SQL server), all works fine, but I must be able to read from a file share?

View 8 Replies View Related

SQL Server 2008 :: Importing SSIS Package Into File Store And Calling It Within A Job

Oct 9, 2015

I've imported an SSIS package into Management Studio (2008 R2) and I've set up a SQL Server Agent job to call the package but it fails due to error code: 0xc00160aa.

As far as I can tell this is because it is unable to read the location of the package despite it being a file system location within Management studio. Also I can run the package manually within Management Studio, but when I try to call it via the job it fails.

View 0 Replies View Related

ResultSet Streaming

Jan 10, 2008

I am using the SQL Server 2005 JDBC driver. I need to write a piece of code that makes use of the streaming ResultSet. That is as soon as I get the first row, a worker thread should be able to begin processing on it without waiting for the second row to arrive. When the second row arrives, the second worker thread should start processing on this new row without waiting for the third row to arrive and so on. Usually, with a ResultSet, I need to wait for all the rows to arrive first before I can start navigating the rows in the ResultSet. But in my code, I need to start navigating the ResultSet even as more rows are pouring in from the DBServer. How can I do that? Any pointers in this direction will be helpful. Further, I want to know will setfetchsize be of any help here? If I set the setfetchsize value to 1, does that mean that as soon as I get the first row, I can start working on this row of the resultset(viz start navigating the ResultSet) without waiting for the second row to arrive in the ResultSet?

View 1 Replies View Related

Live Streaming Requirements?

Apr 10, 2006

We're doing a website for a TV station, and they want to have a live TV broadcast on their website. Our experience is basically in CMS and portal solutions, but we have never had any experience in live streaming, so this is all new to us, and we're looking for a technical and financial proposal with your suggestions and recommendations. These are the details of the project, with some of the inquiries we have:
-The signal is SDI (Pal 25 fps)
-The output (online streaming) 15 fps
-The user will have two options to view the movie in:
1. Low: 56K
2. High: 300K
-The video should come within a custom-designed page, probably as code we can place inside an HTML table's cell, not as a link to an external page
-We're expecting about 200 users for the video each month; each user's logon time is expected to be 10 min.
-We have a Red hat Linux on our server that will host the site, and we use php/MySql technology.


Inquires:
1. Why is Windows Media better than Real in terms of performance? Can you provide us with a comparison table to include for our client?
2. If we choose Windows Media, what are the requirements we need to have, from both the TV station side and our Linux server? And how much would it cost?
4. Is there any hardware required on the TV station side, such as a DVR or some sort of encoder? If so, what are your recommendations (type/brand)? Please note that this is a TV station and it needs high-quality professional hardware.
5. How (and this is probably the most ambiguous task)
5.1. Do you broadcast the digital signal coming from the encoder (I guess) to the server?
5.2. And then place this signal on a PHP page? Do you give us a link to embed in our code?
6. If we choose to host the live streaming on a separate server and include the URL in our code, would our server still be required to support Windows Media, or does it not matter as long as the video streaming is hosted on a Windows streaming server?
7. How much disk space does it need to host a live streaming (with the details mentioned above)?

We would really appreciate your comments and suggestions, as this is our first time dealing with live streaming.

View 3 Replies View Related

How Data Store On Server??

Feb 15, 2004

Hi all,
Please show me how data is stored on the server.
thanks for reading.

View 5 Replies View Related

DB Engine :: Streaming And Table Valued Parameters

Jul 11, 2015

I want to insert lots of data into two tables. For this I want to use table valued parameters and a stored procedure. So, what is the better way for best performance? Using two stored procedures or a single procedure with two parameters? How does SQL Server consume the data if I use a single procedure with two parameters? Is it really streaming the data? I mean, is SQL Server already starting to insert the first rows as soon as it gets them, even if the client is still sending more and more rows, and then the same with the next table?
e.g. I have a procedure like this:

CREATE PROCEDURE [dbo].[usp_UpdateElements] (
@tvpElementsToInsert As [dbo].[tvpElements] Readonly,
@tvpElementValuesToInsert As [dbo].[tvpElementValues] Readonly) AS
                Begin
                    Insert Into Elements Select * From @tvpElementsToInsert;
                    Insert Into ElementValues Select * From @tvpElementValuesToInsert;
                End

View 6 Replies View Related

How To Store Large Character Data In MS SQL Server ???

Sep 11, 1998

I am developing a simple ASP-based form that stores user info in MS SQL Server. I have created a table in the SQL server to store the data and defined the body field with this line: `body char(255)`. The problem is that if the user inputs a string longer than 255 characters it gets chopped off. How would you suggest solving this problem? Should I use the `text` datatype instead? Any comments are very appreciated!

View 2 Replies View Related

In What Sql Server Data Type Would You Store Security Ids?

Nov 5, 2007

Security IDs seem to be made up of at least three 32-bit unsigned numbers and a few smaller numbers. We believe their lengths vary. We don't mind dropping the "S" from the front. What data type do you recommend be used for their storage? We expect only limited joins and user visibility on this column. We may wish to create an index on this column. We think varchar and varbinary are the two major choices.

View 2 Replies View Related

DB Engine :: Store Landset Data In Server?

Jul 14, 2015

What is the best practice to store Landset data in SQL Server? The volume of data should grow up to 100 TB.

View 3 Replies View Related

How To Set-up Sql Server 2000 In Win2k3 Server To Store Big-5 Chinese Data

Jul 23, 2005

I am using Windows 2003 Server, English version. I want to store Big-5 data, so I installed SQL Server 2000 the same way I would install it on Windows 2000, with a server collation of Chinese_Taiwan_Stroke_CL_AS. However, the data is stored in the database server in Unicode instead of Big-5, unlike on the Windows 2000 OS. I would like to ask how I can set things up so that SQL Server 2000 can store the Big-5 data.
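A sketch of one direction (an assumption, not from the post): non-Unicode char/varchar columns with a Chinese (Taiwan) collation store Big-5 bytes, while nchar/nvarchar columns always store Unicode regardless of collation, so the column type and collation both matter.

-- Sketch: a non-Unicode column collated for Traditional Chinese (table name is a placeholder)
CREATE TABLE dbo.Big5Test
(
    SomeText varchar(200) COLLATE Chinese_Taiwan_Stroke_CI_AS NULL
);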

View 6 Replies View Related

Transact SQL :: How To Combine Two Store Procedure

Jun 12, 2015

How can I combine two or more procedures to make a single query?

I tried like below:

EXEC PROC1
UNION
EXEC PROC2
UNION
EXEC PROC 3
 
It's throwing an invalid syntax error.
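A common workaround (a sketch; it assumes all three procs return result sets with the same column layout, illustrated here with two made-up columns): capture each proc's output into a temp table and query that.

-- Sketch: INSERT ... EXEC into one temp table, then de-duplicate like UNION would
CREATE TABLE #Combined (Col1 int, Col2 varchar(100));

INSERT INTO #Combined EXEC PROC1;
INSERT INTO #Combined EXEC PROC2;
INSERT INTO #Combined EXEC PROC3;

SELECT DISTINCT Col1, Col2 FROM #Combined;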

View 3 Replies View Related

SQL 2012 :: Max Size Of File Which Store To DB By File Stream

Dec 5, 2014

Is there a limitation on the size of a file stored in the DB via FILESTREAM in SQL Server 2008, or does it accept all sizes?

View 1 Replies View Related

Transact SQL :: Include A Condition In Store Proc

Sep 29, 2015

I have an existing stored proc with insert and update statements. I want to add a condition at the top of the proc, with conditions like:

declare @dayofweek int
set datefirst 1
select @dayofweek = datepart(dw,getdate())
--select @dayofweek
if @dayofweek = 1

[Code] ...

If either of the above two conditions is met, then the actual stored proc has to run.

My actual stored proc, in which I need to incorporate the above conditions, is:

CREATE PROCEDURE [dbo].[Load_Product]
@FileDate date = NULL
AS
BEGIN
DECLARE @EventText varchar(500),
@Rows int = 0,
@RowsTotal int = 0,

[Code] .....
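A sketch of one way to add the guard (the set of allowed @dayofweek values is an assumption; the thread only shows the check for 1). These would be the first statements inside the proc body, before the existing logic:

    DECLARE @dayofweek int;
    SET DATEFIRST 1;
    SELECT @dayofweek = DATEPART(dw, GETDATE());

    IF @dayofweek NOT IN (1, 5)   -- on any other day, exit without running the load
        RETURN;

    -- ...the proc's existing insert/update logic continues here...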

View 12 Replies View Related

Transact SQL :: Split A String And Store As First And Second Part

Jul 17, 2015

I am using SQL Server 2008. I have strings like this:

AB-123
CDW-32
declare @First_Part varchar(3)
declare @Second_Part varchar(5)

I want to split the string with delimiter '-' and store as first part and second part.

I saw a few sample functions to split a string, but these return a table of values.

I simply want first part and second part.

In above examples context:

@First_Part = 'AB'
@Second_Part = '123'

Any simple way to do this?
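A simple sketch using CHARINDEX (it assumes the string always contains exactly one '-', which matches the examples in the post):

DECLARE @str varchar(10) = 'AB-123';
DECLARE @First_Part varchar(3);
DECLARE @Second_Part varchar(5);

SELECT @First_Part  = LEFT(@str, CHARINDEX('-', @str) - 1),
       @Second_Part = SUBSTRING(@str, CHARINDEX('-', @str) + 1, LEN(@str));

SELECT @First_Part AS First_Part, @Second_Part AS Second_Part;  -- AB / 123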

View 6 Replies View Related

Transact SQL :: Disadvantages Of Column Store Index

Nov 3, 2015

What are the disadvantages of a columnstore index in SQL Server 2012?

View 4 Replies View Related

SQL Server 2012 :: DB2 Store Procedure Returning Two Data Sets

Oct 13, 2014

A DB2 stored procedure returns two data sets when executed from SSMS using a linked server. Do we have any simple way to save the two data sets into two different tables?

View 1 Replies View Related

SQL Server 2012 :: Store Tabular Data Having Parenthesis As Column Name Into XML

Nov 9, 2014

We are storing the changed data of tables in XML format for auditing purposes. The functionality is already achieved; we are using the FOR XML PATH clause to convert relational table data into XML format.

Now, a table has a column name containing '('. For example, the name of the column is ColumnName(). In this case we cannot convert it to XML using the FOR XML clause. It shows the error:

Column name 'columnName()' contains an invalid XML identifier as required by FOR XML; '(' (0x0028) is the first character at fault.
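The usual workaround (a sketch; the table name and the second column are placeholders, not from the post) is to alias the offending column to a name that is a valid XML identifier before FOR XML runs:

-- Sketch: alias [ColumnName()] so FOR XML sees a legal element name
SELECT  [ColumnName()] AS [ColumnName]
FROM    dbo.AuditSource
FOR XML PATH('row'), ROOT('rows');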

View 1 Replies View Related

Transact SQL :: Store Date Permanently In Stored Proc

Aug 24, 2015

I have a stored procedure which runs periodically. What I want is, when it first runs, to store that date permanently (or until the next time it runs), and then I need to take my data from that stored proc where hist_date is between the date stored when it first ran (for example, 08/21/2015) and today's date (for example, 08/24/2015).

Now, the next time it runs, the stored date should be updated; in this case it should be 08/24/2015.

How can I do this in a stored proc? I tried to use a temp table, but it didn't work.
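A sketch of one way to keep the date between runs: use a small permanent control table instead of a temp table (all object names below are made up, and the proc's real query is only hinted at).

-- One-time setup
CREATE TABLE dbo.ProcRunLog
(
    ProcName    sysname NOT NULL PRIMARY KEY,
    LastRunDate date    NOT NULL
);

-- Inside the stored procedure:
DECLARE @LastRun date;

SELECT @LastRun = LastRunDate FROM dbo.ProcRunLog WHERE ProcName = 'MyProc';
IF @LastRun IS NULL
    SET @LastRun = CAST(GETDATE() AS date);        -- very first run

SELECT *                                           -- the proc's real query goes here
FROM   dbo.MyHistory
WHERE  hist_date BETWEEN @LastRun AND CAST(GETDATE() AS date);

-- Remember today's date for the next run
UPDATE dbo.ProcRunLog SET LastRunDate = CAST(GETDATE() AS date) WHERE ProcName = 'MyProc';
IF @@ROWCOUNT = 0
    INSERT INTO dbo.ProcRunLog (ProcName, LastRunDate) VALUES ('MyProc', CAST(GETDATE() AS date));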

View 13 Replies View Related

Transact SQL :: Why Store Procedure Not Return Any Value / Result After Using Exec

Jul 22, 2015

I used a new query to execute my stored procedure, but it didn't return any value. Is there any error in my SQL statement?

USE [Pharmacy_posicnet]
GO
/****** Object: StoredProcedure [dbo].[usp_sysconf] Script Date: 22/07/2015 4:01:38 PM ******/
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
ALTER procedure [dbo].[usp_sysconf]

[Code] ....

View 6 Replies View Related






