Weird Replication Bug. Bulk Data Stream Was Incorrectly Specified As Sorted.
Apr 17, 2006
Before anyone even says it: I checked the collation order on everything and it's the same. I get the error when the snapshot is being bulk copied to the subscriber.
I'm on SQL 2000 SP4; the server and database collations are SQL_Latin1_General_CP1_CI_AS. Here's a repro. First, run this in a blank db:
if exists (select * from dbo.sysobjects where id = object_id(N'[dbo].[Event_Transactions]') and OBJECTPROPERTY(id, N'IsUserTable') = 1)
drop table [dbo].[Event_Transactions]
GO
CREATE TABLE [dbo].[Event_Transactions] (
[EventTransactionId] [int] IDENTITY (1, 1) NOT NULL ,
[OrphanedFlag] [bit] NOT NULL ,
[ProcessedFlag] [bit] NOT NULL ,
[ProcessedTimeStamp] [datetime] NULL ,
[EventTimeStamp] [datetime] NOT NULL
) ON [PRIMARY]
GO
CREATE CLUSTERED INDEX [EventTransactions_IDX_ProcessedOrphanedEventTimeStamp] ON [Event_Transactions] (
[ProcessedFlag],
[OrphanedFlag],
[EventTimeStamp]
) ON [PRIMARY]
GO
ALTER TABLE [dbo].[Event_Transactions] ADD
CONSTRAINT [PK_Event_Transactions] PRIMARY KEY NONCLUSTERED
(
[EventTransactionId]
) ON [PRIMARY]
GO
insert into Event_Transactions (
OrphanedFlag
,ProcessedFlag
,ProcessedTimeStamp
,EventTimeStamp
)
values (
1
,0
,NULL
,'2004-05-07 15:15:24.000'
)
insert into Event_Transactions (
OrphanedFlag
,ProcessedFlag
,ProcessedTimeStamp
,EventTimeStamp
)
values (
0
,1
,'2004-07-08 13:04:01.513'
,'2004-07-07 16:52:08.000'
)
Now, use transactional replication to replicate it to another database, taking all the defaults. When the distribution agent tries to apply the snapshot, it fails with the message mentioned in the title.
Has anyone ever seen this? It's keeping us from considering MS replication for one of our major products. Thanks.
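For what it's worth, this error generally means the bulk copy was handed an ORDER hint that the incoming data does not satisfy, and the snapshot for a table with a clustered index is typically applied as an ordered load. One way to probe it outside replication is to bcp the table out in native format and reload it by hand with the same hint; a sketch only, with a placeholder file path:
-- Diagnostic sketch, not a fix: repeat the ordered load the snapshot would do.
-- 'C:\repro\Event_Transactions.bcp' is a hypothetical native-format bcp export.
BULK INSERT dbo.Event_Transactions
FROM 'C:\repro\Event_Transactions.bcp'
WITH (
    DATAFILETYPE = 'native',
    ORDER (ProcessedFlag, OrphanedFlag, EventTimeStamp),  -- the clustered index key
    TABLOCK
)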
May 22, 2006
I've read the other posts related to this issue, but I'm just REALLY confused as to what's happening in my case. Like everyone else, it was working fine in SQL 2000 but now in SQL 2005 there is an issue. I'm calling a stored procedure with parameters defined like this:
@action varchar(10),
@GLTransactionID int = NULL OUTPUT ,
@GLBatchID int = NULL ,
@GLAccountID int = NULL ,
@CurrencyID int = NULL ,
@LocalDebit decimal(28, 13) = NULL ,
@LocalCredit decimal(28, 13) = NULL ,
@BaseDebit decimal(28, 13) = NULL ,
@BaseCredit decimal(28, 13) = NULL ,
@TransID int =NULL,
@Description varchar(255) = NULL
I am calling this proc from VS.NET 2003 using the .NET SqlClient Data Provider (C#). I'm setting the values of the parameters like this:
cm.Parameters.Add("@action", "insert");
cm.Parameters.Add("@GLBatchID", _gLBatchID.DBValue);
cm.Parameters.Add("@GLAccountID", _gLAccountID.DBValue);
cm.Parameters.Add("@CurrencyID", _currencyID.DBValue);
cm.Parameters.Add("@LocalDebit", _localDebit.DBValue);
cm.Parameters.Add("@LocalCredit", _localCredit.DBValue);
cm.Parameters.Add("@BaseDebit", _baseDebit.DBValue);
cm.Parameters.Add("@BaseCredit", _baseCredit.DBValue);
cm.Parameters.Add("@TransID", _transID.DBValue);
cm.Parameters.Add("@Description", _description.DBValue);
When I execute the call to the stored proc I get this:
"The incoming tabular data stream (TDS) remote procedure call (RPC) protocol stream is incorrect. Parameter 8 ("@BaseDebit"): The supplied value is not a valid instance of data type numeric. Check the source data for invalid values. An example of an invalid value is data of numeric type with scale greater than precision."
Using the VS.NET command window I then inspect that parameter to see what the heck is going on and get this:
?cm.Parameters["@BaseDebit"].SqlDbType
Decimal
?cm.Parameters["@BaseDebit"].Precision
0
?cm.Parameters["@BaseDebit"].Scale
22
?cm.Parameters["@BaseDebit"].DbType
Decimal
?cm.Parameters["@BaseDebit"].Value
1000000
[System.Decimal]: 1000000
So I set a decimal parameter to 1,000,000; that parameter in the DB is defined as decimal(28,13), so it should fit no problem, but it seems the SqlClient data provider is confused and thinks 1,000,000 is decimal(0,22)?
Apr 6, 2008
I receive this error when I make a deployment to our new server (a virtual server).
The report works fine in Report Manager. In my application, I use the RenderStream method to retrieve the images and embed them in the webform. I googled it and found some people having the same issue because of the cookie, so they set 'UseSessionCookies' = false in the ConfigurationInfo table of the ReportServer database. I tried this, but no luck.
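For reference, the UseSessionCookies change mentioned above is normally a one-row update in the ReportServer catalog; a sketch only (back up the database first and confirm the table layout on your build):
-- Sketch: flip the UseSessionCookies setting in the ReportServer database.
UPDATE dbo.ConfigurationInfo
SET    Value = 'False'
WHERE  Name  = 'UseSessionCookies'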
Also, there is a hotfix from Microsoft http://support.microsoft.com/kb/913363.
I have requested a copy, but I'm not sure whether it's going to be helpful.
Any clues or suggestions welcome.
Thanks
Jul 23, 2005
For some reason a stored procedure which I have created is incorrectly saving the date to the table. It seems the day and month are being swapped around, e.g. a date which should be the 12th April (12/04/2005) is saving as the 4th December (04/12/2005).
The parameter used in the stored procedure comes from a VB6 app; I amended this so the format was "yyyymmdd hh:mm:ss". The full line in VB being:
Parameters.Append .CreateParameter("date_of_call", adChar, , 17, Format(firstCallDateTime, "yyyymmdd hh:mm:ss"))
When I run my VB app it works fine. The syntax in the stored procedure is:
CREATE PROCEDURE dbo.spUpdValues
@data_id int,
@date_of_call datetime
as
update data
SET date_of_call = CONVERT(char, @date_of_call, 101)
where data_id = @data_id
Is it because the convert format is using an American date format? I can't see why, as I can't reproduce this error using my own PC (the date saves correctly), and I can also confirm it's not happening to everybody who uses the app. If it is happening for specific users, what could be the cause? I've checked Regional Settings and all seems fine there. Any ideas on what could be doing this, as I'm struggling to investigate any further.
To debug I ran the stored procedure directly, manually inputting the variable, and again no problem. Also, the following SQL statement shows no problem...
declare @date_of_call datetime
set @date_of_call = '20041101 08:30:00'
select CONVERT(char, @date_of_call, 101)
select CONVERT(char, @date_of_call, 106)
------------------------------
11/01/2004
(1 row(s) affected)
------------------------------
01 Nov 2004
(1 row(s) affected)
Any help would be much appreciated.
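The CONVERT to style 101 inside the UPDATE is the likely culprit: assigning that mm/dd/yyyy string back to a datetime column forces an implicit string-to-datetime conversion that obeys each session's DATEFORMAT/language setting, so logins running with a dmy setting get day and month swapped. A small sketch of the round trip:
-- Round-trip demo: a session with a dmy date format misreads the style-101 string.
SET DATEFORMAT dmy                                  -- e.g. a British login
DECLARE @date_of_call datetime
SET @date_of_call = '20050412 15:15:24'             -- 12 April 2005, unambiguous input
SELECT CONVERT(datetime, CONVERT(char(10), @date_of_call, 101))  -- comes back as 04 Dec 2005

-- Safer: keep the value as datetime end to end and skip the string conversion:
-- UPDATE data SET date_of_call = @date_of_call WHERE data_id = @data_id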
Mar 6, 2006
Hi,
I need to copy a table between 2 different SQL servers.
There are a lot of different solutions :
-DTS
-Bulk Insert
-Replication
...
What are the pros and cons of these solutions?
How do I choose?
Thanks
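For a one-off copy there is also the plain INSERT/SELECT (or SELECT INTO) over a linked server; a sketch, where RemoteSrv is a hypothetical linked server name:
-- Sketch: one-off copy of a table from a linked server into the local database.
SELECT *
INTO   dbo.MyTable_copy
FROM   RemoteSrv.SourceDb.dbo.MyTable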
Jul 28, 2004
Hi all,
I am just trying to establish if there is a way that I can capture a direct data stream (feed) from another server and have it input into a table.
The scenario is -
I have a PABX that is outputting a stream of data with each record being 83 characters long. Each of the fields is separated by a space. The end of the record is marked by a line feed. The port it is being sent to on my local machine is port 5000.
Is this possible to do or not? I know that some of our DB guys who use oracle can do it, I would like to have this as a MSSQL database so that I can use it and manipulate it a bit further.
Thanks.
May 8, 2007
Hi, I have a problem which I don't entirely know how to tackle. Essentially, is it possible to query a web service (via HTTP) using SQL Server 2000, and then import that data into the database? I have seen many posts on OPENXML and SQL Server's bulk load facilities, but nobody seems to mention whether you can open an HTTP stream and read the XML in from there. Any help would be greatly appreciated. Thanks.
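On SQL Server 2000 the usual server-side route is the MSXML HTTP object driven through the sp_OA* procedures, with OPENXML to shred the response. A rough sketch only: it needs OLE Automation rights, the URL and the element/column names are placeholders, and responseText is capped here at varchar(8000):
DECLARE @obj int, @hr int, @xml varchar(8000), @doc int

-- Fetch the XML over HTTP with the MSXML COM object.
EXEC @hr = sp_OACreate 'MSXML2.ServerXMLHTTP', @obj OUT
EXEC @hr = sp_OAMethod @obj, 'open', NULL, 'GET', 'http://example.com/service?op=list', 'false'
EXEC @hr = sp_OAMethod @obj, 'send'
EXEC @hr = sp_OAGetProperty @obj, 'responseText', @xml OUT
EXEC sp_OADestroy @obj

-- Shred it with OPENXML; column names here are placeholders.
EXEC sp_xml_preparedocument @doc OUTPUT, @xml
SELECT * FROM OPENXML(@doc, '/root/row', 2) WITH (id int, name varchar(50))
EXEC sp_xml_removedocument @doc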
Aug 3, 2007
I have run into a strange issue that I believe is a SQL Reporting Services issue.
I have a report laid out in landscape setting that has 4 columns of text. Two of the columns are sub-reports (due to the complexity and size, we did not flatten out the data in the stored procedure) and two of the columns are regular fields.
The 2 columns of regular fields are smaller, and normally only grow to about half the height of the page. One of the two sub-reports contains large amounts of text, and at times grows larger than the height of the page.
When the sub-report grows larger than the current page, it correctly starts up on the next page. But the 2 fields of data from the main dataset (not the sub-report columns) repeat themselves on the next page as well.
What is even more strange is the 2 fields of data from the main dataset only repeat data to grow vertically as far as the sub-report needs to grow. So if there is more data in either of these 2 fields than is needed for the sub-report to grow on the 2nd page, it will cut off the data in both of these fields.
I have tried placing the information in a group header, turned "Repeat on new page" both True and False, took away the table header and footer, forced a page break after each group, and tried using the "Hide Duplicates" property on the field within the details section; nothing has fixed the issue.
If anyone has run into this and found a work around, let me know.
Thank you,
T.J.
Jul 18, 2005
I have four tables that need to be loaded into an ASP.NET application. They need to be loaded together into one result set and sorted. Is it possible to load four tables together and sort them using an SQL statement?
To clarify, say I have the following data:
Table1: Anglesey, Cardiff, Ceredigion
Table2: London, Dorset, Lancashire
Table3: Antrim, Armagh
Table4: Glasgow, Berwick, Edinburgh
I'd want the data retrieved from all four tables and sorted so that the data retrieved would be:
Anglesey, Antrim, Armagh, Berwick, Cardiff, Ceredigion, Dorset, Edinburgh, Glasgow, Lancashire, London
I am aware of and am using the SELECT ... ORDER BY feature of MSSQL in my present ASP.NET application to retrieve from single database tables. I'm using merged datasets and a sort method to solve the above problem at the moment.
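A minimal sketch of doing it in one statement, assuming each table exposes its names in a column called PlaceName (the real column names are not given above):
-- UNION ALL stacks the four tables; the ORDER BY sorts the combined result.
SELECT PlaceName FROM Table1
UNION ALL
SELECT PlaceName FROM Table2
UNION ALL
SELECT PlaceName FROM Table3
UNION ALL
SELECT PlaceName FROM Table4
ORDER BY PlaceName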
Feb 19, 2004
Hello guys,
Have any ever seen this error before?
Steps I made:
1) Created Publications by script - using Manual Sync. (Transactional Repl)
2) Created Subscriptions by script
3) Started Distribution Agent
THEN THIS ERROR MSG:
Violation of PRIMARY KEY constraint 'PK__@snapshot_seqnos__24F8BF73'. Cannot insert duplicate key in object '#24049B3A'.
(Source: SYD242 (Data source); Error number: 2627)
---------------------------------------------------------------------------------------------------------------
I'm lost. Has anyone got any ideas?
May 21, 2007
I'm implementing merge replication between SQL CE and sql server 2005 SP2. Here is the problem I'm facing
I have dynamic filters on some tables like
Select * from table1 inner join table2
ON table1.field1 = table2.field3
and table2.flag = 1
When the flag in table2 turns true (1), all the matching records in table1 should be returned to the subscriber.
But that is not happening. The filter does not return any records to the subscriber, even though I see records when I run the query on the server. My take on this is that the publisher thinks there are no changes on table1, so there is nothing to update. So to work around that I run dummy updates like
update table1
set [name] = [name]
where table1.field1 in
(Select field2 from table2 where flag = 1)
This updates the records and the publisher updates the subscriber the first time. When I set the flag back to false (0), it is supposed to remove the data from the subscriber, but I get this error.
The merge process failed because it detected a mismatch between the replication metadata of the two replicas, such that some changes could be lost leading to non-convergence. This could be due to the subscriber not having synchronized within the retention period, or because of one of the replicas being restored to a backup older than retention period, or because of the publisher performing more aggressive cleanup on articles of type download-only and articles with partition_options = 3. (Source: MSSQL_REPL, Error number: MSSQL_REPL-2147199363)
Get help: http://help/MSSQL_REPL-2147199363
The common generation watermark is invalid at this replica since it does not exist or metadata for changes not yet propagated may have been cleaned up. (Source: MSSQLServer, Error number: 21800)
Get help: http://help/21800
I have no idea what's going on. Any clues, anyone? Thanks for your help.
Ramesh
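For what it's worth, the usual way out of the watermark/metadata-mismatch pair of errors above is to reinitialize that subscription; a sketch of the call at the publisher, with placeholder names:
-- Sketch only: reinitialize the merge subscription; @upload_first tries to push
-- pending subscriber changes up before the snapshot is reapplied.
EXEC sp_reinitmergesubscription
     @publication   = N'MyMergePub',
     @subscriber    = N'MySubscriberServer',
     @subscriber_db = N'MySubscriberDb',
     @upload_first  = N'true'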
Dec 15, 2005
On the national Server: SQL 2005 Enterprise
On the mobile clients: SQL 2005 Workgroup.
I was asked to find a solution to the following scenario:
2 mobile subscribers, S1 and S2, of the same simple merge publication P1 on server N.
Day 1,
S1 and S2 both synch up with N and both go off to do fieldwork
Day 3,
S1 and S2 both synch up with N.
S1 goes back to work
S2 shutdown the laptop and goes on 2-week vacation.
2 weeks later
S1 synchs up with the server and goes off to do fieldwork.
S2 meets S1 in the field. Their field site is in the North Pole.
S2 has the laptop with data 2 weeks old but no longer has access to the master publisher N to synch the replica and get the latest changes.
S2 will have to sync with S1, since S1's database is fresh. The challenge is to have the S2 and S1 replicas identical.
The Questions:
Is it possible for S2 to sync with S1?
If yes, how do we go about it? We need S1 and S2 to end up with identical replicas on their machines.
Now that S1 and S2 have identical databases and are both doing their fieldwork at the North Pole, can they both sync back with the national publisher N when they have access?
Keep in mind that S2 got its data updated from the replica on S1.
Thank you!
Apr 9, 2008
I'm trying to import some data from an Excel 2007 file into a SQL table. I created the Source Connection Manager and an OLE DB Source Data Flow Component which uses it. (Correct me if I'm wrong, but I can't use the Excel Source because of the version of Excel the file is saved in.) The outgoing Data Flow Path thinks some of the fields being imported should be of type float, when in fact they have alpha characters in them.
The fields in the database are defined as varchars.
A Data Conversion Transform doesn't seem right because I need the data to come out of the source as string data (which it actually is in the Excel file). Even if I convert it to string on the way to the destination, I would still be missing the original alpha characters.
How/where do I change it (Source Connection Manager, OLE DB Source Data Flow Component, something else) to correctly identify the field's type?
TIA,
Christy
Mar 2, 2007
I've been setting up subscriptions to a merge publication for the past 3 days. All of a sudden, I'm getting a pile of very strange errors. Replication is configured. I have 16 subscribers to an existing publication configured and synchronizing changes without any issues. The script that I'm using to create all of the subscriptions is as follows:
use [PIC]
exec sp_addmergesubscription @publication = N'PIC', @subscriber = N'machinename\SQLEXPRESS',
@subscriber_db = N'MyDatabase', @subscription_type = N'Push', @sync_type = N'Automatic',
@subscriber_type = N'Global', @subscription_priority = 75, @description = N'', @use_interactive_resolver = N'False'
exec sp_addmergepushsubscription_agent @publication = N'PIC', @subscriber = N'machinename\SQLEXPRESS',
@subscriber_db = N'MyDatabase', @job_login = null, @job_password = null, @subscriber_security_mode = 1,
@publisher_security_mode = 1, @frequency_type = 64, @frequency_interval = 0, @frequency_relative_interval = 0,
@frequency_recurrence_factor = 0, @frequency_subday = 0, @frequency_subday_interval = 0, @active_start_time_of_day = 0,
@active_end_time_of_day = 235959, @active_start_date = 0, @active_end_date = 0
GO
The last one that I added, #17, gives the following errors after successfully creating the subscription.
Command attempted:
{call sys.sp_MSmergesubscribedb ('true', 0) }
Error messages:
The merge process could not initialize the subscription. Ensure that the subscription registration exists at the publisher, and reregister the subscription if necessary. (Source: MSSQL_REPL, Error number: MSSQL_REPL-2147201011)
Get help: http://help/MSSQL_REPL-2147201011
RegCreateKeyEx() returned error 5, 'Access is denied.' (Source: MSSQLServer, Error number: 22002)
Get help: http://help/22002
Could not add article resolver 'Microsoft SQL Server Additive Conflict Resolver' information to the registry (Source: MSSQLServer, Error number: 21713)
Get help: http://help/21713
Could not register article resolver: 'Microsoft SQL Server Additive Conflict Resolver'. (Source: MSSQLServer, Error number: 21715)
Get help: http://help/21715
The system tables for merge replication could not be created successfully. (Source: MSSQLServer, Error number: 20008)
Get help: http://help/20008
I've tried to manually create it using the GUI and get an even stranger error message as follows:
TITLE: New Subscription Wizard
------------------------------
Microsoft SQL Server Management Studio is unable to access replication components because replication is not installed on this instance of SQL Server. For information about installing replication, see the topic Installing Replication in SQL Server Books Online.
For help, click: http://go.microsoft.com/fwlink?ProdName=Microsoft+SQL+Server&ProdVer=9.00.1399.06&EvtSrc=Microsoft.SqlServer.Management.UI.ReplUtilitiesErrorSR&EvtID=ReplicationNotInstalled&LinkId=20476
------------------------------
ADDITIONAL INFORMATION:
Replication components are not installed on this server. Run SQL Server Setup again and select the option to install replication. (Microsoft SQL Server, Error: 21028)
For help, click: http://go.microsoft.com/fwlink?ProdName=Microsoft+SQL+Server&ProdVer=09.00.1399&EvtSrc=MSSQLServer&EvtID=21028&LinkId=20476
------------------------------
BUTTONS:
OK
------------------------------
I very obviously have replication installed by virtue of the fact that I have a publication and 16 subscriptions that are currently synchronizing. Any ideas?
Aug 3, 2006
Publisher is 2005 x64, subscribers SS2000 (SP3) and SS2005 x64. Pull agents, no filters on subscriptions. We are seeing many seemingly random conflicts between the SS2000 subscriber and the publisher. It happens on several different tables.
One table is never edited; only inserts happen everywhere and deletes happen on the SS2000 subscriber. Deletes will sometimes generate a conflict. The reason is "The row was deleted at 'CTS11.CTS' but could not be deleted at 'cts4a.cts'. Unable to synchronize the row because the row was updated by a different process outside of replication." CTS11 is the SS2000 subscriber, CTS4A is the publisher.
Probably an unrelated bug, but when looking at conflicts on this same table in the SS2005 conflict viewer, I get the error "ID is neither a DataColumn nor a DataRelation for table summary (System.Data)" and then "Column ID does not belong to table summary (System.Data)". The ID column is a rowguid; the only unusual thing about the table is that it has a varchar(8000) field plus some other fields.
Other tables generate conflicts with this reason "The row was updated at 'CTS11.CTS' but could not be updated at 'cts4a.cts'. The merge process was unable to synchronize the row." I enabled verbose logging in the merge agent but the log file didn't contain any further explanation.
This same topology and schema worked fine when all publishers and subscribers were SS2000.
Any insight into how to fix this would be appreciated.
Dec 25, 2007
I ran a preview of a matrix-based report whose column headers are dates. The dates seem to be displaying in a somewhat (not completely) random order from left to right. How can I ensure that they display chronologically from left to right?
Jun 15, 2015
I have a database that has entries that I want sorted by date order. Each entry has an auto ID number allocated (primary key auto-sequencing), which I want to change to reflect the sorting (so the first date has the first auto ID number and so on). I've gone into the database and sorted the entries as I want them. Then I've gone into Design View to delete and re-establish the primary key auto-sequence. However, it is not keeping the date order in the database (i.e. entry ID 3140's date is 12/06/2015, but 3141's is 02/02/2012). How do I get it to maintain the order?
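One way to do this, sketched with hypothetical table and column names since the real ones are not given: rebuild the table and let INSERT ... SELECT ... ORDER BY hand out the new identity values in date order (SQL Server guarantees the identity values follow the ORDER BY for that form):
-- Sketch: copy into a fresh table so the identity follows date order, then swap.
CREATE TABLE dbo.Entries_new (
    EntryID   int IDENTITY(1,1) PRIMARY KEY,
    EntryDate datetime NOT NULL
    -- plus the remaining columns from the original table
)

INSERT INTO dbo.Entries_new (EntryDate /*, other columns */)
SELECT EntryDate /*, other columns */
FROM   dbo.Entries
ORDER  BY EntryDate            -- identity values are assigned in this order

-- then drop/rename so dbo.Entries_new takes the place of dbo.Entries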
Jan 17, 2008
I'm having some issues with BULK INSERT.
This is the table:
CREATE TABLE [dbo].[tmp_GA_status](
[GA_recno] [int] NOT NULL,
[GA_desc] [varchar](40) NULL
)
This is the file (unicode):
1|"test1"
2|"test2"
3|"test3"
4|"test4"
5|"test5"
6|"test6"
7|"test7"
8|"test8"
and this is the sql:
bulk insert tmp_GA_status from 'C:\temp\TextDump\GA_status.dta'
with (CODEPAGE='RAW', FIELDTERMINATOR='|', ROWTERMINATOR='\n', DATAFILETYPE='widechar')
So yeah, pretty simple. But whatever I do I get this:
Msg 4864, Level 16, State 1, Line 1
Bulk load data conversion error (type mismatch or invalid character for the specified codepage) for row 1, column 2 (GA_desc).
So what am I doing wrong ?
Jun 29, 2015
I'm trying to use Bulk insert for the first time and getting the following error. I think it might have something to do with my Format File and from the error msg there's a conversion error for the first column. In my database the Field is nvarchar(6) so my best guess is to use SQLNChar for the first column. I've checked the end of each line is CR LF therefore the is correct for line 7 right?
Msg 4863, Level 16, State 1, Line 1
Bulk load data conversion error (truncation) for row 1, column 1 (ASXCode).
Msg 7399, Level 16, State 1, Line 1
The OLE DB provider "BULK" for linked server "(null)" reported an error. The provider did not give any information about the error.
Msg 7330, Level 16, State 2, Line 1
Cannot fetch a row from OLE DB provider "BULK" for linked server "(null)".
BULK INSERT tbl_ASX_Data_temp
FROM 'M:DataASXImportTest.txt'
WITH (FORMATFILE = 'M:DataASXSQLFormatImport.Fmt')
Dec 5, 2007
I need to convert an integer value fo for example 1071123 to a date field. This value would represent 107 = Year 2007, 11 = Month November, 23 = 23rd day of month. So effectively this would translate to 2007-11-23 00:00.000.
I would like to do this in the Integration Services package. I am retrieving data from an AS/400 system into a SQL Server DB. I'm not sure if I can do this with the Derived Column object or whether there is a better way to achieve this conversion.
Please, can anyone shed some light on this for me?
Thanks
CdnGator (Jason)
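The arithmetic is straightforward because the integer is in CYYMMDD form (the leading digit is a century offset from 1900): adding 19000000 gives yyyymmdd, which converts directly with style 112. A sketch in T-SQL; the same expression can be mirrored in a Derived Column:
-- 1071123 -> 19000000 + 1071123 = 20071123 -> 2007-11-23
DECLARE @cymd int
SET @cymd = 1071123

SELECT CONVERT(datetime, CONVERT(char(8), 19000000 + @cymd), 112) AS ConvertedDate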
Jan 25, 2002
We have the bulk copy option enabled for our DB and we really use it. Would it be possible to set up snapshot replication of particular tables over the Internet to a remote server, from which the data will only be retrieved and never changed? Also, is it necessary to have PKs in all tables for this one-way snapshot replication? (For transactional replication it is needed, as far as I know.)
Thanks a lot!
Andy.
May 29, 2007
Hi,
I've got a problem regarding transactional replication.
Let me explain my scenario.
I'm doing transactional replication between two databases.
When the publisher and subscriber are created, the data gets bulk copied from the publisher table to the subscriber table.
My main intention was to create replication between different tables with different fields, in which I succeeded.
But the main problem is that I want to stop this bulk copy from publisher to subscriber.
Scenario 1: my subscriber table may contain some previous data which will be replaced with publisher data due to the bulk copy.
I don't want this. I want to avoid the bulk copy and instead create procedures (for insert, update and delete transactions) on the subscriber which will take care of replication.
I achieved almost everything, but am not able to avoid this bulk copy during the creation of the subscriber.
As far as I know, the only way I can stop the bulk copy is by creating the subscription without a subscription agent. But then the procedures (for insert, update and delete transactions)
won't get created on the subscriber.
Help me regarding the above scenario; I need it urgently.
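A sketch of the option that usually covers this: adding the subscription with a sync type of 'replication support only' skips the snapshot bulk copy but still generates the custom insert/update/delete procedures at the subscriber (it assumes the subscriber already holds matching schema and data; the names below are placeholders):
-- Sketch only: no snapshot data transfer, replication procs still created.
EXEC sp_addsubscription
     @publication    = N'MyPublication',
     @subscriber     = N'MySubscriberServer',
     @destination_db = N'MySubscriberDb',
     @sync_type      = N'replication support only'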
Jan 22, 2008
I know what it means. The problem is that I can't see where it's happening. Looking at SQL Profiler shows the info is correct. The front-end code only allows a max of 50 characters for the title and 2000 for the announcement. The expiration date is picked from a date picker; subgroupID and who are both 12 characters and come from session info (this is correct, as it's used elsewhere and they are working).
This is from SQL Profiler:
exec ws_Admin_Announcements_AddlAnnouncement @subgroupid = 'D4F4CB571A09', @title = N'over 12 characters', @announcement = N'<p>test</p>', @expiration = 'Jan 22 2008 11:41AM', @who = 'D05B47F2CFB1'
Class:
public int insertNewAnnouncement(string subgroupid, string title, string announcement, DateTime expiration, string who, string connectionString)
{
    try
    {
        dbConnection = new SqlConnection(connectionString);
        dbCommand = new SqlCommand("ws_Admin_Announcements_AddlAnnouncement", dbConnection);
        dbCommand.CommandType = CommandType.StoredProcedure;
        dbCommand.Parameters.Add("@subgroupid", SqlDbType.VarChar, 12);
        dbCommand.Parameters[0].Value = subgroupid;
        dbCommand.Parameters.Add("@title", SqlDbType.NVarChar, 100);
        dbCommand.Parameters[1].Value = title;
        dbCommand.Parameters.Add("@announcement", SqlDbType.NText);
        dbCommand.Parameters[2].Value = announcement;
        dbCommand.Parameters.Add("@expiration", SqlDbType.SmallDateTime);
        dbCommand.Parameters[3].Value = expiration;
        dbCommand.Parameters.Add("@who", SqlDbType.VarChar, 12);
        dbCommand.Parameters[4].Value = who;

        dbConnection.Open();
        intRowsAffected = dbCommand.ExecuteNonQuery();
    }
    finally
    {
        dbCommand.Dispose();
        dbConnection.Dispose();
    }

    return intRowsAffected;
}
SP:
CREATE PROCEDURE ws_Admin_Announcements_AddlAnnouncement
    @subgroupid varchar(12),
    @title nvarchar(100),
    @announcement ntext,
    @expiration smalldatetime,
    @who varchar(12)
AS
insert into tblannouncements (fk_strsubgroupid, strtitle, strannouncement, dteexpires, dtecreated)
values (@subgroupid, @title, @announcement, @expiration, getdate())
Table Layout:
fk_strSubGroupID    varchar        12
strTitle            nvarchar       200
strAnnouncement     ntext          16
dteExpires          smalldatetime  4
dteCreated          smalldatetime  4
Any help would be greatly appreciated. Thanks ^_^
Feb 14, 2008
My SQL script: -
CREATE TABLE #temp_1
(LEGAL_ENTITY varchar(3) ,
DESCRIPTION varchar(25)
)
DECLARE
@select varchar(2000),
@from varchar(2000),
@where varchar(4000),
@final varchar(8000)
SELECT @select = 'SELECT LEGAL_ENTITY, DESCRIPTION'
SELECT @from = ' FROM table_1 WITH (NOLOCK)'
SELECT @final = @select+@from
INSERT INTO #temp_1
( LEGAL_ENTITY,
DESCRIPTION
)
EXEC (@final)
SELECT LEGAL_ENTITY, DESCRIPTION
From #temp_1
drop table #temp_1
Using the above script, I get an error message in the SQL 2005 database:
Server: Msg 8152, Level 16, State 14, Line 1
String or binary data would be truncated.
I do not get the error in SQL 2000, or if I comment out EXEC(@final) and put the SELECT statement in directly. There is some SELECT statement modification in between that forces me to use the variables to construct the SELECT statement, so can anyone advise how to avoid the error?
Please advise, thank you.
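Msg 8152 simply means a value is longer than the column it is being inserted into, so the usual workaround is to size #temp_1's columns to match (or exceed) the real definitions in table_1; a sketch with hypothetical wider sizes:
-- Sketch: make the temp table at least as wide as the source columns.
CREATE TABLE #temp_1 (
    LEGAL_ENTITY varchar(10),     -- match table_1's actual column definitions
    DESCRIPTION  varchar(255)
)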
Feb 4, 2002
I'm running SQL 7.0 SP3 on two different machines (one with additional hotfixes). I'm taking a nightly snapshot of imported data on Server1 and pushing it out to another SQL 7.0 server on our network, Server2. All but one table is copied successfully. On the final table, I receive the message, "The process could not bulk copy into table '"%"'."
Error Information Category: Data Source, Source: Server2, Number 4813.
Full error message: "Expected the text length in data stream for bulk copy of text, ntext, or image data."
I've looked up 4813, but it's pretty ambiguous/generic. Also, when I SELECT from Server1 and INSERT INTO Server2 in Query Analyzer, I receive no errors. Does anyone have any insight?
May 10, 2006
Hello everybody,
I'm trying to set up transactional replication between 2 servers. This is a one-way replication: Server A to Server B, not Server B to Server A. I am able to replicate all the tables except one. I added commands to the agent so that it would create an output file, possibly with more or better information.
Here is a portion of the error causing the failure
Agent message code 20037. The process could not bulk copy into table
'"tblSuppContractFee"'.
[5/5/2006 8:02:10 PM]01sqlft003.distribution: {call
sp_MSadd_distribution_history(4, 6, ?, ?, 0, 0, 0.00, 0x01, 1, ?, 6, 0x01,
0x01)}
Adding alert to msdb..sysreplicationalerts: ErrorId = 65,
Transaction Seqno = 000075400000ff9b000b00000002, Command ID = 6
Message: Replication-Replication Distribution Subsystem: agent
01sqlft003-EDGE-01SQLFT004-4 failed. The process could not bulk copy into
table
'"tblSuppContractFee"'.[5/5/2006 8:02:10 PM]01sqlft003.distribution: {call
sp_MSadd_repl_alert(3, 4, 65, 14151, ?, 6, N'01sqlft003', N'EDGE',
N'01SQLFT004',
N'EDGE_REPLICATION', ?)}
ErrorId = 65, SourceTypeId = 1
ErrorCode = ''
ErrorText = ''
[5/5/2006 8:02:10 PM]01sqlft003.distribution: {call sp_MSadd_repl_error(65,
0, 1, ?, N'', ?)}
[5/5/2006 8:02:10 PM]01SQLFT004.EDGE_REPLICATION: exec
dbo.sp_MSupdatelastsyncinfo N'01sqlft003',N'EDGE', N'', 0, 6, N'The process
could not bulk copy into table
''"tblSuppContractFee"''.'
Can somebody help me in finding a solution for this error? I don't see any error text, and there are no resources available for the error code showing up in the log file.
Thanks in advance.
Dec 18, 2006
Execute the following T-SQL within Query Analyzer of SQL Server 2000:
DECLARE @dTest DATETIME
SET @dTest = '2001-1-1 1:1:1:991'
SELECT @dTest
SET @dTest = '2001-1-1 1:1:1:997'
SELECT @dTest
SET @dTest = '2001-1-1 1:1:1:999'
SELECT @dTest
What do you get? This is my result, which is weird:
2001-01-01 01:01:01.990
2001-01-01 01:01:01.997
2001-01-01 01:01:02.000
What is the reason for this weird behaviour?
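For context, datetime stores the time of day in 1/300-second ticks, so milliseconds always land on .000/.003/.007 boundaries and get rounded to the nearest one; two more values that show the same rounding:
SELECT CONVERT(datetime, '2001-01-01 01:01:01.998'),  -- rounds down to .997
       CONVERT(datetime, '2001-01-01 01:01:01.999')   -- rounds up to 01:01:02.000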
May 12, 2015
We have a filtered publisher (SQL Server 2012) with several pull and push subscriptions that have been functioning fine for years.
We have just added a new pull subscriber from a remote server. We configured an alternate folder location on the subscriber (pointing to an existing path on a remote server). We generated a new snapshot and partitioned snapshot, and moved them to the subscribing server at the alternate path. We're seeing a problem we've never seen before:
When the merge agent runs, it applies the schema and performs a few hundred bulk inserts, but then proceeds to 'download changes to subscriber' as though the snapshot is old (it is brand new and should have all transactions). We have never seen this before, i.e. all data is usually applied in bulk when creating a new or reinitializing an existing subscriber.
We marked the subscriber for re-initialization (from both the publishing and subscribing servers) with a fresh set of snapshots. The same problem occurred. We have also tried it in our test environment (which mirrors production) and the snapshot applies as expected (via bulk inserts).
(It might be relevant to note the publisher was recently upgraded in-place from 2005 to 2012.)
Jul 24, 2001
What I'm trying to solve:
I have an application that generates SQL queries, and sometimes uses
DISTINCT where the result set has no dupe rows. In terms of database
resources, I'm trying to figure out if it's worth it to change the app to be smart enough to not use DISTINCT where it won't serve any purpose, or whether to let it do the DISTINCT and save added complexity in the query-building application. I.e. what is the cost of DISTINCT where there are no dupe rows?
What I want to know:
Can someone explain how the stream aggregate operator actually goes about
doing its work?
Does this always create a temp table for sorting and discarding duplicates (for DISTINCT)? If the answer is "no or sometimes", how does it do so in the case where a temp table is not involved? I noticed the estimated I/O for this operator was zero for some queries I wrote against pubs. Does this mean that the optimizer believes the temp table needed will fit in memory and creates it in-memory? Or does the estimated I/O figure not include disk writes for work tables?
tia for any info
Bill
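One way to answer the cost question empirically, using the pubs database the post already mentions (au_id is the primary key of authors, so both queries below return the same duplicate-free rows):
-- Compare the I/O and elapsed time of the two forms; the DISTINCT plan typically
-- carries an extra Stream Aggregate (or Sort/Distinct) step.
USE pubs
SET STATISTICS IO ON
SET STATISTICS TIME ON

SELECT DISTINCT au_id FROM dbo.authors
SELECT au_id FROM dbo.authors

SET STATISTICS IO OFF
SET STATISTICS TIME OFF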
Oct 13, 2006
Hi everybody,
I don't know if this is the right place to put this question.
The problem is:
I have an application developed using VB 6.0 and SQL Server 2000. It was working fine on my machine. I had MS XP installed on my system along with SP1. But when I installed MS XP SP2, I got the error while saving the record.
the error is
"Protocol Error in TDS Stream".
I went to the Microsoft site searching for the resolution of this error. They gave the solution of installing MDAC 2.8 SP1. But when I install MDAC 2.8, it gives me this error:
"MDAC 2.8 RTM is incompatible with this version of Windows. All of its features are currently part of Windows."
And since MDAC 2.8 is not installed, I cannot install MDAC 2.8 SP1 either.
I have MDAC 2.5 installed on the system.
How can I resolve it? Please give a solution as soon as possible; I am in great need of it.
Jul 20, 2005
Hi all, I am new to ADODB.Stream. I am using the following code:
lRecordset.Open "Select * from <some table-name>"
'this query returns more than 1000 records
Dim lstream As New ADODB.Stream
'assigning the recordset data to the stream
lRecordset.Save lstream
lStream.Position = 0
Dim lRecordset2 As New ADODB.Recordset
lRecordset2.Open lstream
MsgBox lRecordset2.RecordCount
My problem is that the query returns, say, 1500 records, but when I assign the same stream to another recordset it copies only 485 records. That is, lstream is saving only 485 records. Is there any size limit on the stream? How can I do this using a stream only? If you have any solution please reply back. Thanks in advance.
Jan 2, 2008
Hi all,
I have an odd issue. I have a query that returns one row from SQL Server. The value for one column (payment) in this row is 5.00. There are no other rows returned, only one row. However, when I do a sum on payment it returns 195! I don't know where it's getting this from, especially since I can see all the rows by removing the SUM() and there is only one row. Here is the query:
Select v_payments.tripid, Instransporttotal, lastName, firstName, tripDate, v_payments.payment, credit, translation, inscoid, v_payments.checkid
from v_billbuckets
left join v_patient on v_billbuckets.patientID = v_patient.ssn
left join v_payments on v_billbuckets.tripid = v_payments.tripid
left join v_InsCompany on v_billbuckets.inscompanyid = v_inscompany.inscoid
left join v_billRecItem on v_payments.tripid = v_billRecItem.tripid
WHERE v_payments.checkID = 315898
GROUP BY v_payments.tripid, Instransporttotal, lastName, firstName, tripDate, v_payments.payment, v_payments.credit, translation, inscoid, v_payments.checkid
The above query returns one row. However, if I put SUM() around v_payments.payment in the SELECT line then it is 195. I don't know why it is doing this. Any ideas?
Thanks!
John
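A sketch of how to check the usual suspect: one of the left joins (v_billRecItem is a likely candidate) matches many rows per trip, so before the GROUP BY the 5.00 payment row is repeated; the plain select collapses the duplicates, but SUM() adds every copy, which would be 39 repeated rows for 195:
-- Count how many rows the joins really produce for this check.
SELECT v_payments.tripid, COUNT(*) AS joined_rows
FROM   v_billbuckets
LEFT JOIN v_payments    ON v_billbuckets.tripid = v_payments.tripid
LEFT JOIN v_billRecItem ON v_payments.tripid    = v_billRecItem.tripid
WHERE  v_payments.checkID = 315898
GROUP BY v_payments.tripid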
Apr 3, 2007
I have a report that is deployed at multiple sites without any problems, except one. At one site, when I render it to PDF, it doesn't look at all right, and the alignment on a table goes from center to right-aligned for no obvious reason. It then stretches a textbox horizontally to force it onto another page. I've double-checked and made sure that all the expand options on every single thing on the report are off, and I really don't understand why it would happen at only one site, even though the data is almost exactly the same. Not only that, table cells don't stretch horizontally, they stretch vertically, so again, I'm stumped about why this one site would render incorrectly to a PDF. Even in the report viewer, everything looks fine.
I am wondering if anyone has had a problem with rendering to pdf incorrectly when the rdl and data are correct, and what your solution was.
P.S. It isn't a problem with margins, so please don't give the generic "CHECK YOUR MARGINS" response.