Processed Files

Jul 4, 2006

Hi ppl

I have a confusing situation that I CAN resolve with cursors, BUT cursors are far too performance-hungry to run.

Tables: LogFileIn, LogFileOut,

I want to look in LogFileIn (LFI) for files that have a NULL LogFileOut (the response to the LFI) AND don't have any other value.

A file can have multiple LFO's depending on how many times that file gets processed.

It is confusing but bear with me.


Code:


select * from LogFileIn where IdentityFileChar = 'BPRTFILE'
and responseLogFileOutId is null



This does pull out a list of files that haven't been processed, BUT that same file might have been processed again and succeeded. Therefore the results I get aren't accurate.

With the cursor I would have taken the filename from the LFI and used that to check if that filename comes up again in the LFO as processed.

Did you get that? And can you help?? I'm confusing myself!

This table records every time a file is run, so there might be 20 identical LFI's all with different LFO's... I'm looking for LFI's that have NULL, and only NULL.
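For what it's worth, the cursor-style check described above can usually be folded into a single NOT EXISTS query. A hedged sketch, assuming LogFileIn has a FileName column that identifies the physical file (adjust to the real column names):

-- Sketch only: FileName is an assumed column; the idea is to keep only the
-- LFI rows whose file name never appears with a non-NULL response anywhere.
select lfi.*
from LogFileIn as lfi
where lfi.IdentityFileChar = 'BPRTFILE'
  and lfi.responseLogFileOutId is null
  and not exists (
        select 1
        from LogFileIn as lfi2
        where lfi2.FileName = lfi.FileName            -- same physical file
          and lfi2.responseLogFileOutId is not null   -- but a later run did get a response
      )

If the processed file name lives in LogFileOut instead, point the NOT EXISTS at that table; the shape of the query stays the same.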

View 1 Replies



SQL Query Not Being Processed By SQL Server

Oct 24, 2001

Hello all,
I have a query that keeps getting errors from our SQL Server. Can anyone see why ?

select job.key,job.job,job.item,job.job-date
From job
where job.job-date > (dateadd (year,-1,job-date))
order by job.job, job.item, job.key



I have tried bracketing the job-date but I still get errors saying it is
unable to understand the query. Thank you for your help in advance.
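For reference, key is a reserved word and job-date contains a hyphen, so every occurrence of those identifiers needs to be delimited, not just the one in the WHERE clause. A hedged sketch of the bracketed form, keeping the original logic:

select job.[key], job.job, job.item, job.[job-date]
from job
where job.[job-date] > dateadd(year, -1, job.[job-date])  -- note: always true as written; GETDATE() may have been intended
order by job.job, job.item, job.[key]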

View 1 Replies View Related

Order For Conditions To Be Processed

Mar 3, 2004

What is the order in which conditions are processed for a SQL query? I.e., for
select * from table1, table2 where cond1 and cond2 and cond3
which condition will be processed first? (For optimisation purposes, should the condition cutting down the maximum number of rows be placed first or last?)

View 3 Replies View Related

Highlight First Row From The Processed Data

Nov 28, 2007



Hi,
I have a report like this:


Is there a way I can highlight only the first row and not the other rows?

It should look like this:

Texas
XYZ ABC Kingsville
PSD ATY city2
RTE RET city3


Is there a way I can do this?
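If it is acceptable to handle this in the dataset query rather than in the report layout, one option is ROW_NUMBER(), showing the state only on the first row of each group. The table and column names below are made up for illustration:

-- Hypothetical table/column names: dbo.ReportData(State, Office, Manager, City).
-- The state is shown only on the first row of each state group and blanked elsewhere.
SELECT CASE WHEN ROW_NUMBER() OVER (PARTITION BY r.State ORDER BY r.Office) = 1
            THEN r.State ELSE '' END AS State,
       r.Office, r.Manager, r.City
FROM dbo.ReportData AS r
ORDER BY r.State, r.Office;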

View 4 Replies View Related

Done: 1 Processed Of 1 Total; 1 Errors.

Apr 18, 2008

I'm working with SQL Server Reporting Services (SQL Server 2005)
I tried to configure a Data-Driven Subscription but the run failed with the error:
Done: 1 processed of 1 total; 1 errors.

I cannot find anything in the error log ReportServerService__<datetime>.log located at <Program Files>\Microsoft SQL Server\MSSQL.#\Reporting Services\LogFiles.

How can I find useful information about the error message?
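Besides the trace log, the report server catalog database keeps a short status string per subscription, which is sometimes more descriptive than the job history. A hedged sketch (the catalog database is commonly named ReportServer; adjust if yours differs):

-- Shows the most recent status text and run time for each subscription.
SELECT c.Name        AS ReportName,
       s.Description AS SubscriptionDescription,
       s.LastStatus,
       s.LastRunTime
FROM ReportServer.dbo.Subscriptions AS s
JOIN ReportServer.dbo.[Catalog]     AS c ON c.ItemID = s.Report_OID
ORDER BY s.LastRunTime DESC;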

View 1 Replies View Related

How To Get Last Processed Date Of Stored Procedure

Jul 30, 2007

Hello everybody

Help me to get the last processed date of a stored procedure.
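SQL Server does not keep this permanently, but while the procedure's plan is still in cache the last execution time can be read from the DMVs (SQL Server 2005 and later). A sketch, with dbo.MyProc as a placeholder name:

-- Only reports procedures whose plans are currently cached;
-- the value is lost when the plan is evicted or the instance restarts.
SELECT MAX(qs.last_execution_time) AS LastExecuted
FROM sys.dm_exec_query_stats AS qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st
WHERE st.dbid     = DB_ID()
  AND st.objectid = OBJECT_ID('dbo.MyProc');  -- placeholder procedure name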

View 6 Replies View Related

Query Could Not Be Processed. Access Is Forbidden

Jul 20, 2005

I am using HTTP to access my cube (MS Analysis Server) and there are users who are getting "Query could not be processed. Access is forbidden" intermittently. Can someone help out and let me know what is causing this issue? Some users are not getting this error.

View 3 Replies View Related

Received Message Constantly Processed.

May 21, 2007

Hello,

Just when it seemed that everything was working, some weird behaviour came out.


I'll try to summarize the problem without posting the complete code.

Service Broker is set to have a dialog between two databases on the same SQL Server instance.



The Initiator queue has retention=on and there is an activation SP to handle errors and Target's end dialog message.

The Target queue has retention=off, MAX_QUEUE_READERS = 1, and there is an activation SP to receive the message (WAITFOR (RECEIVE TOP (1) ...), TIMEOUT 30000) and do something with it (for example, insert into a DB).



The conversation has a dialog timeout to end the dialog after a while.


The problem is that the message is constantly processed. The process doesn't stop if I end the dialog after the processing, either.

N.B. the RECEIVE is within a transaction that I commit at the end.



Some other information that I found out in the meanwhile:

This was my complete WAITFOR (RECEIVE ...):

WAITFOR ( RECEIVE top(1) -- just handle one message at a time
@message_type=message_type_id, --the type of message received
@messagetypename=message_type_name,
@message_body=message_body, -- the message contents
@dialog = conversation_handle -- the identifier of the dialog this message was received on
FROM [TargetQueue]
), timeout 1000;

if (@@ROWCOUNT = 0)
BEGIN
COMMIT;
BREAK;
END

If I delete TIMEOUT 1000, everything works as expected...
Inside the IF (@@ROWCOUNT = 0) BEGIN..END block I also wrote an INSERT into a table to see whether the end of the queue was reached, but this insert never occurs (neither with nor without the timeout).
I'm happy that it works, but if this is the solution it makes no sense to me!
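For comparison, here is a hedged sketch of the usual shape of a target activation loop (not your exact code; TargetQueue and dbo.MessageLog are placeholder names), where each receive is wrapped in its own transaction and EndDialog/Error messages end the conversation:

DECLARE @dialog          uniqueidentifier,
        @messagetypename sysname,
        @message_body    varbinary(max);

WHILE (1 = 1)
BEGIN
    BEGIN TRANSACTION;

    WAITFOR (
        RECEIVE TOP (1)
            @dialog          = conversation_handle,
            @messagetypename = message_type_name,
            @message_body    = message_body
        FROM TargetQueue
    ), TIMEOUT 1000;

    IF (@@ROWCOUNT = 0)
    BEGIN
        COMMIT TRANSACTION;  -- nothing received: leave the loop
        BREAK;
    END

    IF (@messagetypename IN (N'http://schemas.microsoft.com/SQL/ServiceBroker/EndDialog',
                             N'http://schemas.microsoft.com/SQL/ServiceBroker/Error'))
    BEGIN
        END CONVERSATION @dialog;           -- close our side of the dialog
    END
    ELSE
    BEGIN
        INSERT INTO dbo.MessageLog (body)   -- placeholder for the real application work
        VALUES (@message_body);
    END

    COMMIT TRANSACTION;
END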

Any ideas?
Thank you!


M.B.






View 9 Replies View Related

Need To Get The Last Processed Cube Date In SSIS

Apr 15, 2008



Hi All,

I have a scenario where I need to run the scheduled package every 10 minutes. Let me give some examples. Once my ETL process is done, my SS_Batch table gets one row inserted. Here I have two columns, SS_Batch_cd and SS_Create_TS. When the ETL process runs successfully, the SS_Batch_cd column will have the value 'C', which means 'Completed'. Similarly, when the ETL process fails, the SS_Batch_cd column will have the value 'F', which means 'Failed'. And when the ETL process is in progress, the SS_Batch_cd column will have the value 'P', which means it is in 'Progress'. The SS_Create_TS column will have a date like '2008-04-15'.

Note: This SS_Batch table gets inserted only once, when the ETL job is over (whether the job ran successfully, failed, or is in progress), along with the date.

Actually I need to run the package every 10 minutes because I don't know when the cube was last processed. If I can get this last-processed cube date, then I can check it against the SS_Create_TS and SS_Batch_cd columns in the SS_Batch table. Say, if the last-processed cube date is greater than SS_Create_TS, then I can refresh or process the cube. This is the validation I need to do to achieve my goal.
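One way to express that check is a single query in an Execute SQL Task; the sketch below assumes you refresh when a completed batch is newer than the last cube process, so flip the comparison if your rule is the other way round (all names and the parameter mapping are illustrative):

-- @LastProcessedCube would be supplied from the SSIS package variable
-- (e.g. mapped as a parameter of an Execute SQL Task).
DECLARE @LastProcessedCube datetime;
SET @LastProcessedCube = ?;   -- OLE DB parameter marker in an Execute SQL Task

SELECT CASE
         WHEN EXISTS (SELECT 1
                      FROM dbo.SS_Batch
                      WHERE SS_Batch_cd = 'C'                -- completed batch
                        AND SS_Create_TS > @LastProcessedCube)
         THEN 1 ELSE 0
       END AS RefreshCube;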

What are all the control flow tasks we need to have in SSIS to achieve this scenario? Please explain briefly how to solve this problem.

Please advise on this.

Thanks in advance.
Anand Rajagopal.

View 4 Replies View Related

Display Number Of Records Processed

Aug 20, 2007

I've got a stored procedure that processes a TON of records...

What I would like to do is write a row to a "progress" table which shows how many rows have actually been processed.

I have a simple counter defined in the procedure:

SET @COUNTER = 0

Each time the procedure loops through, it increments by 1:

SET @COUNTER = @COUNTER + 1

What I would like to do is write rows to a PROCESS table which would read:

PROCESSED 1000 rows
PROCESSED 2000 rows
PROCESSED ...... rows
etc.

I have a slight idea how to pull this off, but I'm not sure how to handle the "every 1000 rows" part.
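The usual trick for the "every 1000" part is the modulo operator: write a progress row only when the counter is an exact multiple of 1000. A minimal sketch, assuming a hypothetical dbo.ProcessLog table:

-- Inside the loop, right after SET @COUNTER = @COUNTER + 1:
IF @COUNTER % 1000 = 0
BEGIN
    INSERT INTO dbo.ProcessLog (LogMessage, LoggedAt)   -- hypothetical progress table
    VALUES ('PROCESSED ' + CAST(@COUNTER AS varchar(12)) + ' rows', GETDATE());
END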

If anyone has any insight it would be greatly appreciated!!

Thanks in advance!

View 3 Replies View Related

Does SP1 Change The Way Strings Or Data Are Processed?

Nov 1, 2006

We've just patched our Dev server, and, of our 3 servers (Dev, Test, Prod), we see major changes in the output of a raw data import process that runs nightly. Each night we import tables from a Remedy helpdesk system running on Oracle and place each ticket into a row on a table, tracking the changes and history of the ticket, etc. This includes tracking when supervisor groups are changed during the course of a ticket (ie Helpdesk to Data Comms to Billing etc). Now, after SP1, the results on Dev are skewed with partial strings showing in the From and To fields, broken in odd places (like the middle of words).

Has anyone noticed any changes in the way post-SP1 SQL Server 2005 processes strings? Does it automatically trim spaces or convert NULLs, etc.?

The data import should be identical between Dev and the other servers.

View 3 Replies View Related

How To Stop The Dts Package In SSIS Once The Cube Is Processed

Apr 17, 2008

Hi All,
I need some clarification on stopping my package once the cube is refreshed or processed.

Below are the steps and data flow transformations I have in my package:

1. Data Flow Task

2. Script Task 1 - I have written code for getting the last processed cube date (a global variable, lastProcessedCube, has been declared for it).

3. Script Task 2 - I have written code for the SS_Batch table, where I get the Create_Ts date, which is assigned to another global variable, create_ts.

4. Analysis Processing Task.

In between Script Task 2 and the Analysis Processing Task I have set @lastProcessedCube > @create_ts as the Expression and Constraint in the Precedence Constraint Editor.

Actually I need to run the package every 10 minutes, which I can do with a job schedule, and I need to refresh or process the cube daily. Is there any way to stop the package once my cube has been processed for that day, and then start the package again the next day? Is it possible to do this? Please let me know.

Thanks in Advance,
Anand Rajagopal

View 14 Replies View Related

SQL Server 2008 :: Error - Database Page Could Not Be Processed

Oct 16, 2015

I have moved some DBs from a physical server to a virtual server while the physical server is being upgraded. This morning, when I checked to make sure all the jobs were running fine, I saw that a backup job had failed for one of the databases. The error was:

Executed as user: NT AUTHORITY\SYSTEM. BACKUP 'database 1' detected an error on page (1:1400911) in file 'D:\Data\database 1.mdf'. [SQLSTATE 42000] (Error 3043) BACKUP DATABASE is terminating abnormally. [SQLSTATE 42000] (Error 3013). The step failed.

Then I ran a DBCC check db command and found the following errors:

Msg 8928, Level 16, State 1, Line 1
Msg 8939, Level 16, State 98, Line 1
Msg 8976, Level 16, State 1, Line 1
Msg 8978, Level 16, State 1, Line 1
Msg 8939, Level 16, State 98, Line 1
Msg 8978, Level 16, State 1, Line 1

The messages vary from "data from one page cannot be referenced on another" and "data page cannot be processed" to "page was not seen in the scan although the parent was", etc.
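When chasing this kind of corruption it helps to capture the complete CHECKDB output rather than the truncated summary; a sketch (the database name is taken from the error message above, so adjust as needed, and any repair decision should start from a known-good backup):

-- Full, unabridged corruption detail for the damaged database.
DBCC CHECKDB (N'database 1') WITH NO_INFOMSGS, ALL_ERRORMSGS;

The full output identifies the object, index and page behind each Msg 89xx error, which is what you need before deciding between restoring from backup and any repair option.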

View 9 Replies View Related

Query Analyzer : Losing The ___ Rows Processed Message

Jul 20, 2005

Hi; I have been writing a lot of quick and dirty T-SQL scripts to correct database issues. I've been saving the output of the scripts for future records. I have been executing them through Query Analyzer. I find the messages Query Analyzer puts out obscure my output, i.e. "6 rows processed". Is there any way to shut this off?

Steve
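If the messages in question are the rowcount messages, they can be suppressed per script with SET NOCOUNT; a minimal example (the UPDATE is just a placeholder statement):

SET NOCOUNT ON;   -- suppresses the "(n row(s) affected)" messages for this session

UPDATE dbo.SomeTable          -- placeholder statement
SET SomeColumn = 'fixed'
WHERE SomeColumn = 'broken';

SET NOCOUNT OFF;  -- optional: restore the default behaviour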

View 3 Replies View Related

Error With Analysis Service 2005: The Query Could Not Be Processed

May 22, 2008




I have created a cube with Analysis Services 2005. I then published the pivot table (generated in Microsoft Excel 2003) as a web page. When I view the web page on the domain LAN, everything works properly. But when I view the web page over the Internet, I get the error.



The query could not be processed:

* An error was encountered in the transport layer

* The peer prematurely closed the connection.



The web page is published on the Internet Information Server; I changed the directory security of the website, but the error persists.

The firewall of the machine is disabled.

Ports 2382 and 2883 are allowed.

The components needed to query the cube are installed on the machine from which the cube is viewed.

All users are allowed in the Analysis Services role.


What other type of restriction could there be?


Thanks...!!!

View 4 Replies View Related

Integration Services :: Write Key Value Of Records Processed To SSIS Log?

May 5, 2015

What's the best way to write the key values of records processed in my SSIS 2012 package to the chosen log provider? My SSIS package deactivates widgets as well as thingies. It was just released into production this week, runs daily, and we'd like to keep a close eye on what it's doing for a couple of weeks; by that I mean on any day be able to quickly see which thingies and widgets were deactivated that morning. It typically deactivates fewer than 5 widgets and thingies per day.

We could dig through the database to see which were deactivated, but that only works if somebody hasn't manually reactivated them since they were deactivated. We need a log. This is a temporary watch we're doing, so we don't want to write to a table or make any significant package changes, such as adding new tasks. It seems like writing the 5-or-so deactivated thingy and widget key values to the log is the best way to watch this package. What's the most efficient way to do this? I'm hoping to avoid a new loop and script component with "Dts.Log" calls, but I don't know any other way.

View 3 Replies View Related

Analysis :: How To Get Last Processed Date For Each TABLE And Display In Measure

May 11, 2015

I can pull a last processed date for the entire model: URL... That works great for showing the last time anything in the entire model was processed, but it is not table specific: updating/processing table A also changes the timestamp on tables B and C. However, if I want to look at just a specific table in the model (build a column for each fact table and a measure to go with it), I find that doing any process operation updates all the =NOW() columns in all the fact tables. If I have a model with 3 fact tables and I do a Process Table using ProcessFull on one of them, all three tables' calculated "LastProcessed" =NOW() columns are updated. URL...

<Type>ProcessFull</Type>
<Object>
  <DatabaseID>MyModel</DatabaseID>
  <DimensionID>TableA</DimensionID>
</Object>

[code]...

Is there a way to set up a measure in the model, at a per-table level, to show the last processed date for each individual table? LastDataRefresh:= MAX('TableA'[LastProcessed])

For example: our sales table may update three times a day, whereas our warehouse inventory table is only updated nightly. I wanted to let end users see, by adding a measure for each table, when the last process event was for a given table in the model.

View 2 Replies View Related

Ntext Columns Truncated To 32K When Processed By Integration Services

May 30, 2007

I have an Integration Services application that retrieves big strings (over 32K) from an ODBC data source and stores them in a SQL Server 2005 database as ntext.
Somehow the strings are truncated to 32K. The app only copies the data from the ODBC data source to the (OLE DB) target.
Also, some of the special characters in the input text are not properly translated.

Any ideas?

View 10 Replies View Related

Create Trigger That Prevent Order Being Processed If The Amount Is 0 In Storage

Feb 9, 2006

When I process an order in table orders_t, I would like to check in storage_t whether we definitely have it in storage. If we have it in storage, I decrease the "amount" by 1 (amount - 1) and process the order. Otherwise it will return nothing.

This is what I've come up with so far:


CREATE TRIGGER checkInStorage
ON orders_t
FOR INSERT, UPDATE

AS
DECLARE
@tOrderId char(3),
BEGIN
SET @tOrderId = (SELECT orderId FROM INSERTED)

--check if the amount in storage

IF EXIST(SELECT amount FROM storage_t WHERE orderId = @tOrderId and amount >= 0)
BEGIN --if it return true, i update the storage by decrease the amount with one
UPDATE storage_t
SET (amount = amount - 1)
WHERE orderId = @tOrderId
END




This doesn't work...
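For reference, here is a hedged sketch of one way the trigger could be written: set-based so multi-row inserts work, rejecting the order when there is no stock, and with the IF EXIST / stray comma / parenthesised SET syntax issues fixed. It assumes storage_t is keyed by orderId, as in the original:

CREATE TRIGGER checkInStorage
ON orders_t
FOR INSERT, UPDATE
AS
BEGIN
    -- Reject the whole statement if any ordered item has no stock left.
    IF EXISTS (SELECT 1
               FROM inserted AS i
               JOIN storage_t AS s ON s.orderId = i.orderId
               WHERE s.amount <= 0)
    BEGIN
        RAISERROR ('Item not in storage; order rejected.', 16, 1);
        ROLLBACK TRANSACTION;
        RETURN;
    END

    -- Otherwise decrease the stock by one for each ordered item.
    UPDATE s
    SET s.amount = s.amount - 1
    FROM storage_t AS s
    JOIN inserted  AS i ON i.orderId = s.orderId;
END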

View 6 Replies View Related

Reporting Services :: How To Pass Processed Value From Subreport To Main Report

Nov 27, 2015

I have a main report and 2 subreports. I would like to pass the total Premium Paid from the 2nd subreport and the Total Bonus received from the 3rd subreport back to the main report.

My scenario is actually much more complicated than what I have attached below, in that I cannot join the queries for the 3 different reports together. But for demo purposes, I created the following sample scenario.

My main report is to display the sales summary by person by location.

A person may have more than 1 account number, and each account number is entitled to bonuses, as illustrated below.

View 2 Replies View Related

Using SSIS With SqlBulkCopy For ETL: How To Delete Processed Records From Staging Table

Jul 27, 2007

I'm looking at SSIS and SqlBulkCopy as a possible method to quickly insert and process large amounts of data. My current method uses sp_xml_preparedocument and OPENXML to parse an XML document of the data I want to process and insert into the database; however, I'm noticing that the performance of SQL Server parsing the XML document isn't that good.

I found the C# SqlBulkCopy method (new in .NET 2.0) and I was thinking it would be faster to use that to load my data into a staging table and then use SSIS to extract the data from the staging table, process and transform it as necessary and finally load it into the final destination tables. I was able to create an Integration Services Project that selects the data from the staging table, does a bit of processing on one of the fields (by calling a stored procedure), and finally loads the processed data into the final table.

The problem with this is that I need to clean out the rows that were extracted from the staging table and I'm not sure how I can accomplish this (I can't just issue a "delete from staging_table" because there may be new records in the staging table that were not processed). Perhaps I can either delete each record as it is processed, or somehow get the last processed identity id from the staging table and delete all records less than or equal to that id.
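The "last processed identity" idea can be done with two small steps around the SSIS run; a sketch with made-up names (a staging table dbo.Staging with an identity column StagingId):

-- 1) Before (or as the first step of) the SSIS processing, capture the
--    highest identity value currently in the staging table.
DECLARE @LastProcessedId int;
SELECT @LastProcessedId = MAX(StagingId)   -- StagingId: hypothetical identity column
FROM dbo.Staging;

-- ... SSIS processes only rows with StagingId <= @LastProcessedId ...

-- 2) After processing succeeds, remove only those rows; anything bulk-copied
--    in while the package ran (StagingId > @LastProcessedId) is left alone.
DELETE FROM dbo.Staging
WHERE StagingId <= @LastProcessedId;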

thanks in advance for any help you can provide, maybe there is an easy way to accomplish what I'm trying to do.

-Matt Palmerlee

View 3 Replies View Related

Reporting Services :: Refresh SSRS Report As Soon As Its SSAS Cube Data Source Is Processed?

Oct 12, 2015

How do I trigger a report's refresh following the processing of its underlying SSAS cubes?

I want to keep the report data updated at all times, especially when this happens while a user is already browsing the report.

I don't want to set an auto refresh for the report every 5 minutes as my cube is processed only one time during the day...

View 8 Replies View Related

Analysis :: Measures Showing No Rows In SSAS Browser After Cube Processed Ok And Calculate Command Is There

Jun 10, 2015

I've created a cube and it processed fine. The CALCULATE command is there. The measure that I'm attempting to run in the SSAS / Visual Studio browser is simply a count-of-rows measure. When I drag the measure to the window, it says no rows available. If I click on the filter that allows nulls, the only change is that it goes from "no rows available" to "NULL".

View 4 Replies View Related

Reporting Services :: Subscription Not Sending Email But Status Shows Done / 15 Processed Of 15 Total - 0 Errors

Aug 25, 2015

I have data-driven subscriptions which are working fine, except for one I have newly developed which is used to send the missing punch report to employees. Now, when the subscription runs, the report server shows that all the emails have gone, but some of the employees are not getting any email. I have tested it by sending each one's report to my own ID, but I also sometimes get only 2 of 15 or 4 of 15, not all of them. I also checked the mailroot folder under my server's inetpub; nothing is stuck there. How can I check why it is not sending email to all employees?

View 3 Replies View Related

Microsoft KB 308049: How To Call A Parameterized Stored Procedure By Using ADO.NET 2.0-VB 2005 Express-pubs Is Processed By ?

Mar 10, 2008

Hi all,

I tried to use "How to call a Parameterized Stored Procedure by Using ADO.NET and Visual Basic .NET" at http://support.microsoft.com/kb/308049 to learn "Use DataReader to Return Rows and Parameter" in my VB 2005 Express. I did the following things:

1) created a stored procedure "pubsTestProc1.sql" in my SQL Server Management Studio Express (SSMSE):


USE pubs

GO

Create Procedure TestProcedure

(

@au_idIN varchar (11),

@numTitlesOUT Integer OUTPUT

)

As

select A.au_fname, A.au_lname, T.title

from authors as A join titleauthor as TA on

A.au_id=TA.au_id

join titles as T

on T.title_id=TA.title_id

where A.au_id=@au_idIN

set @numTitlesOUT = @@Rowcount

return (5)

2) created a project "pubsTestProc1.vb" in my VB 2005 Express and copied the following code from http://support.microsoft.com/kb/308049 (i.e. added the code to the Form_Load event; SQL_Client version):


Imports System.Data

Imports System.Data.Client

Imports System.Data.SqlType

Imports System.Data.Odbc

Public Class Form1

Private Sub Form1_Load(ByVal sender As System.Object, ByVal e As System.EventArgs) Handles MyBase.Load

Dim PubsConn As SqlConnection = New SqlConnection & _

("Data Source=.SQLEXPRESS;integrated security=sspi;" & _

"initial Catalog=pubs;")

Dim testCMD As SqlCommand = New SqlCommand & _

("TestProcedure", PubsConn)

testCMD.CommandType = CommandType.StoredProcedure

Dim RetValue As SqlParameter = testCMD.Parameters.Add & _

("RetValue", SqlDbType.Int)

RetValue.Direction = ParameterDirection.ReturnValue

Dim auIDIN As SqlParameter = testCMD.Parameters.Add & _

("@au_idIN", SqlDbType.VarChar, 11)

auIDIN.Direction = ParameterDirection.Input

Dim NumTitles As SqlParameter = testCMD.Parameters.Add & _

("@numtitlesout", SqlDbType.Int)

NumTitles.Direction = ParameterDirection.Output

auIDIN.Value = "213-46-8915"

PubsConn.Open()

Dim myReader As SqlDataReader = testCMD.ExecuteReader()

Console.WriteLine("Book Titles for this Author:")

Do While myReader.Read

Console.WriteLine("{0}", myReader.GetString(2))

Loop

myReader.Close()

Console.WriteLine("Return Value: " & (RetValue.Value))

Console.WriteLine("Number of Records: " & (NumTitles.Value))

End Sub

End Class
//////////////////////////////////////////////////////////////////////////////////////////////////////////////////
I compiled the above code and I got the following 15 errors:
Warning 1 Namespace or type specified in the Imports 'System.Data.Client' doesn't contain any public member or cannot be found. Make sure the namespace or the type is defined and contains at least one public member. Make sure the imported element name doesn't use any aliases. C:\Documents and Settings\e1enxshc\My Documents\Visual Studio 2005\Projects\pubsTestProc1\pubsTestProc1\Form1.vb 2 9 pubsTestProc1
Warning 2 Namespace or type specified in the Imports 'System.Data.SqlType' doesn't contain any public member or cannot be found. Make sure the namespace or the type is defined and contains at least one public member. Make sure the imported element name doesn't use any aliases. C:\Documents and Settings\e1enxshc\My Documents\Visual Studio 2005\Projects\pubsTestProc1\pubsTestProc1\Form1.vb 3 9 pubsTestProc1
Error 3 Type 'SqlConnection' is not defined. C:\Documents and Settings\e1enxshc\My Documents\Visual Studio 2005\Projects\pubsTestProc1\pubsTestProc1\Form1.vb 10 25 pubsTestProc1
Error 4 ')' expected. C:\Documents and Settings\e1enxshc\My Documents\Visual Studio 2005\Projects\pubsTestProc1\pubsTestProc1\Form1.vb 15 30 pubsTestProc1
Error 5 Name 'testCMD' is not declared. C:\Documents and Settings\e1enxshc\My Documents\Visual Studio 2005\Projects\pubsTestProc1\pubsTestProc1\Form1.vb 17 9 pubsTestProc1
Error 6 ')' expected. C:\Documents and Settings\e1enxshc\My Documents\Visual Studio 2005\Projects\pubsTestProc1\pubsTestProc1\Form1.vb 20 23 pubsTestProc1
Error 7 Name 'RetValue' is not declared. C:\Documents and Settings\e1enxshc\My Documents\Visual Studio 2005\Projects\pubsTestProc1\pubsTestProc1\Form1.vb 21 9 pubsTestProc1
Error 8 ')' expected. C:\Documents and Settings\e1enxshc\My Documents\Visual Studio 2005\Projects\pubsTestProc1\pubsTestProc1\Form1.vb 23 23 pubsTestProc1
Error 9 Name 'auIDIN' is not declared. C:\Documents and Settings\e1enxshc\My Documents\Visual Studio 2005\Projects\pubsTestProc1\pubsTestProc1\Form1.vb 24 9 pubsTestProc1
Error 10 ')' expected. C:\Documents and Settings\e1enxshc\My Documents\Visual Studio 2005\Projects\pubsTestProc1\pubsTestProc1\Form1.vb 26 28 pubsTestProc1
Error 11 Name 'NumTitles' is not declared. C:\Documents and Settings\e1enxshc\My Documents\Visual Studio 2005\Projects\pubsTestProc1\pubsTestProc1\Form1.vb 27 9 pubsTestProc1
Error 12 Name 'auIDIN' is not declared. C:\Documents and Settings\e1enxshc\My Documents\Visual Studio 2005\Projects\pubsTestProc1\pubsTestProc1\Form1.vb 29 9 pubsTestProc1
Error 13 Type 'SqlDataReader' is not defined. C:\Documents and Settings\e1enxshc\My Documents\Visual Studio 2005\Projects\pubsTestProc1\pubsTestProc1\Form1.vb 32 25 pubsTestProc1
Error 14 Name 'RetValue' is not declared. C:\Documents and Settings\e1enxshc\My Documents\Visual Studio 2005\Projects\pubsTestProc1\pubsTestProc1\Form1.vb 39 47 pubsTestProc1
Error 15 Name 'NumTitles' is not declared. C:\Documents and Settings\e1enxshc\My Documents\Visual Studio 2005\Projects\pubsTestProc1\pubsTestProc1\Form1.vb 40 52 pubsTestProc1
////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////
First, I am completely lost here already. Second, I should have the following code from http://support.microsoft.com/kb/308049 too:

OLE DB Data Provider

Dim PubsConn As OleDbConnection = New OleDbConnection & _

("Provider=sqloledb;Data Source=server;" & _

"integrated security=sspi;initial Catalog=pubs;")

Dim testCMD As OleDbCommand = New OleDbCommand & _

("TestProcedure", PubsConn)

testCMD.CommandType = CommandType.StoredProcedure

Dim RetValue As OleDbParameter = testCMD.Parameters.Add & _

("RetValue", OleDbType.Integer)

RetValue.Direction = ParameterDirection.ReturnValue

Dim auIDIN As OleDbParameter = testCMD.Parameters.Add & _

("@au_idIN", OleDbType.VarChar, 11)

auIDIN.Direction = ParameterDirection.Input

Dim NumTitles As OleDbParameter = testCMD.Parameters.Add & _

("@numtitlesout", OleDbType.Integer)

NumTitles.Direction = ParameterDirection.Output

auIDIN.Value = "213-46-8915"

PubsConn.Open()

Dim myReader As OleDbDataReader = testCMD.ExecuteReader()

Console.WriteLine("Book Titles for this Author:")

Do While myReader.Read

Console.WriteLine("{0}", myReader.GetString(2))

Loop

myReader.Close()

Console.WriteLine("Return Value: " & (RetValue.Value))

Console.WriteLine("Number of Records: " & (NumTitles.Value))
//////////////////////////////////////////////////////////////////////////////////////////////////////
Now, I am completely out of touch with these two sets of code from Microsoft KB 308049 and do not know how to proceed to get the following output stated in the KB article - see below:




4. Modify the connection string for the Connection object to point to the server that is running SQL Server.

5. Run the code. Notice that the ExecuteScalar method of the Command object returns the parameters. ExecuteScalar also returns the value of column 1, row 1 of the returned rowset. Thus, the value of intCount is the result of the count function from the stored procedure.
////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////
Please help and tell me what I should do to get this project onto the right track.

Thanks in advance,
Scott Chang



View 16 Replies View Related

Transform Large Volume Of Data. Should It Be Processed Chunk By Chunk?

Dec 16, 2007



Hi,
I have to transform about 60 million rows of data and it runs so slowly that it never finishes in my testing. Do I have to process it chunk by chunk? Or are there any other techniques I can use (I am using a Data Flow Task)? Thanks for any advice.


View 12 Replies View Related

Conversation Handle Processed By An Activated Stored Procedure Service Can Not Be Invoked By A CLR Service Instance?

Dec 1, 2006

I have an initiator and a target Service Broker peer.

Both are controlled by a C# unit test. The initiator uses the Microsoft.Samples.SqlServer class. The target service uses stored procedure activation.

Sending a message from the initiator to the target, saves the content of the message, along with its conversation handle in the target's database specific table.

The unit test needs - at a later time - to instruct the target to send a message back on the same conversation handle to the initiator service.

For this the C# unit test creates a Conversation off of the saved conversation handle:


Service client = new Service("cleintservicename", conn, tran);

Conversation dialog = null;

dialog = new Conversation(client, convHandle);
Sending the message on this dialog generates an error "Message body: <Error xmlns="http://schemas.microsoft.com/SQL/ServiceBroker/Error"><Code>-8495</Code><Description>The conversation has already been acknowledged by another instance of this service.</Description></Error>".
Is the error due to the fact that a service using the activated stored procedure has already picked up the conversation, so that a new reference to the service cannot be created through the Service class in the CLR?
If so, might I need to skip the activated stored procedure in favor of a CLR service altogether?
Any help - greatly appreciated.

View 7 Replies View Related

Northwind Database In SQL Server Management Studio Express Is Lost Or Used/processed By VB 2005 Express:How To Locate/return It

Dec 3, 2007

Hi all,

In the last one and a half years, I used the Northwind database in SQL Server Management Studio Express (SSMSE) to learn the programming of SqlConnections, data sources, Database Explorer, ADO.NET 2.0, etc. via VB 2005 Express.

The Northwind database in my SSMSE has gotten lost very often, but I was not aware of it. How can I find out where the Northwind database is being used or processed by my VB 2005 Express projects (the examples from some tutorial books, or my own trial projects)? How can I release the Northwind database back to my SSMSE from the VB 2005 Express projects? Please help and advise.

Thanks in advance,
Scott Chang

View 2 Replies View Related

SQL Server Admin 2014 :: Separate Data Files / Log Files / TempDB / Backups

Jan 9, 2015

I proposed on a new server that we separate data files, log files, tempdb, backups, etc. onto separate LUNs on a SAN with high-speed solid state drives. I was told that with the new solid-state SAN technology it would decrease performance, and that it did not work the same way as it did when you had RAID 5s, etc. I thought that if things were carried out correctly by a SAN administrator, they would know how to configure for optimal performance.

View 2 Replies View Related

For Loop - Iterate From Older Files To Newer Files Based On File's Timestamp

Mar 13, 2008

In the For Loop, how do I iterate from older flat files to newer flat files based on the files' timestamps? If there are older files in that folder, they should be processed first before continuing with the newer ones.

Any Suggestions?

View 3 Replies View Related

Script Task: How To Compare Files On FTP With Existing Files In Local Folder Before Transfer!

Apr 24, 2008

In the first step of my SSIS package I need to get files from FTP and dump it/them in a local directory, but it's more than that, the logic is like this:
1. If no file(s) found, stop executing and send email saying no file(s) found;
2. If file(s) found, then compare it/them with existing files in our archive folder; if file(s) already exist in archive folder, stop executing and send email saying file(s) already existed, if file(s) not in archive folder yet, then transfer it/them to the local directory for processing.

I know I have to use a Script Task to do this. I did some research and found examples for each of the above 2 steps but not both combined, so that's why I need some help here to get the logic incorporated right.

Thanks for the help in advance, and I apologize for the long lines of code!

example for step 1:
----------------------------------------------------------------------------------------------------------

' Microsoft SQL Server Integration Services Script Task
' Write scripts using Microsoft Visual Basic
' The ScriptMain class is the entry point of the Script Task.

Imports System
Imports System.Data
Imports System.Math
Imports Microsoft.SqlServer.Dts.Runtime
Imports Microsoft.SqlServer.Dts.Pipeline.Wrapper
Imports Microsoft.SqlServer.Dts.Runtime.Wrapper
Imports Microsoft.VisualBasic.FileIO.FileSystem
Imports System.IO.FileSystemInfo

Public Class ScriptMain

' The execution engine calls this method when the task executes.
' To access the object model, use the Dts object. Connections, variables, events,
' and logging features are available as static members of the Dts class.
' Before returning from this method, set the value of Dts.TaskResult to indicate success or failure.
'
' To open Code and Text Editor Help, press F1.
' To open Object Browser, press Ctrl+Alt+J.

Public Sub Main()

Dim cDataFileName As String
Dim cFileType As String
Dim cFileFlgVar As String
WriteVariable("SCFileFlg", False)
WriteVariable("OOFileFlg", False)
WriteVariable("INFileFlg", False)
WriteVariable("IAFileFlg", False)
WriteVariable("RCFileFlg", False)
cDataFileName = ReadVariable("DataFileName").ToString
cFileType = Left(Right(cDataFileName, 4), 2)
cFileFlgVar = cFileType.ToUpper + "FileFlg"
WriteVariable(cFileFlgVar, True)
Dts.TaskResult = Dts.Results.Success
End Sub
Private Sub WriteVariable(ByVal varName As String, ByVal varValue As Object)
Try
Dim vars As Variables
Dts.VariableDispenser.LockForWrite(varName)
Dts.VariableDispenser.GetVariables(vars)
Try
vars(varName).Value = varValue
Catch ex As Exception
Throw ex
Finally
vars.Unlock()
End Try
Catch ex As Exception
Throw ex
End Try
End Sub
Private Function ReadVariable(ByVal varName As String) As Object
Dim result As Object
Try
Dim vars As Variables
Dts.VariableDispenser.LockForRead(varName)
Dts.VariableDispenser.GetVariables(vars)
Try
result = vars(varName).Value
Catch ex As Exception
Throw ex
Finally
vars.Unlock()
End Try
Catch ex As Exception
Throw ex
End Try
Return result
End Function
End Class

example for step 2:
-------------------------------------------------------------------------------------------------------

' Microsoft SQL Server Integration Services Script Task

' Write scripts using Microsoft Visual Basic

' The ScriptMain class is the entry point of the Script Task.

Imports System

Imports System.Data

Imports System.Math

Imports Microsoft.SqlServer.Dts.Runtime

Public Class ScriptMain

' The execution engine calls this method when the task executes.

' To access the object model, use the Dts object. Connections, variables, events,

' and logging features are available as static members of the Dts class.

' Before returning from this method, set the value of Dts.TaskResult to indicate success or failure.

'

' To open Code and Text Editor Help, press F1.

' To open Object Browser, press Ctrl+Alt+J.

Public Sub Main()

Try

'Create the connection to the ftp server

Dim cm As ConnectionManager = Dts.Connections.Add("FTP")

'Set the properties like username & password

cm.Properties("ServerName").SetValue(cm, "ftp.name.com")

cm.Properties("ServerUserName").SetValue(cm, "username")

cm.Properties("ServerPassword").SetValue(cm, "password")

cm.Properties("ServerPort").SetValue(cm, "21")

cm.Properties("Timeout").SetValue(cm, "0") 'The 0 setting will make it not timeout

cm.Properties("ChunkSize").SetValue(cm, "1000") '1000 kb

cm.Properties("Retries").SetValue(cm, "1")

'create the FTP object that sends the files and pass it the connection created above.

Dim ftp As FtpClientConnection = New FtpClientConnection(cm.AcquireConnection(Nothing))

'Connects to the ftp server

ftp.Connect()

'ftp.SetWorkingDirectory("..")

ftp.SetWorkingDirectory("directoryname")

Dim folderNames() As String

Dim fileNames() As String

ftp.GetListing(folderNames, fileNames)

Dim maxname As String = ""

For Each filename As String In fileNames

' whatever operation you need to do to find the correct file...

Next

Dim files(0) As String

files(0) = maxname

ftp.ReceiveFiles(files, "C:\temp", True, True)

' Close the ftp connection

ftp.Close()





'Set the filename you retreive for use in data flow

Dts.Variables.Item("FILENAME").Value = maxname

Catch ex As Exception

Dts.TaskResult = Dts.Results.Failure

End Try

Dts.TaskResult = Dts.Results.Success

End Sub

End Class

View 16 Replies View Related

SQL 2012 :: FOR FILES Command To Delete Old Backup Files On Remote Server?

Feb 24, 2015

I have the need to delete old backup files via TSQL job. Found this solution online:

PushD "
emoteservershareDIFF" &&(
forfiles -m *DIFF*.sqb -d -1 -c "cmd /c del /q @path"
) & PopD

It works remotely if I run it via the command prompt, but when I add this to a T-SQL job on my remote SQL instance, it runs without deleting anything. What am I missing?

View 6 Replies View Related

HELP!!! Cannot Import SSIS Package Files From .dtsx Files

Oct 8, 2007





Brief overview... Running SQL Server 2003 Server Enterprise 64 bit - all service packs and patches current.
SQL Server 2005 Enterprise Edition 64 bit Build Microsoft SQL Server 2005 - 9.00.3054.00 (X64) Mar 23 2007 18:41:50 Copyright (c) 1988-2005 Microsoft Corporation Enterprise Edition (64-bit) on Windows NT 5.2 (Build 3790: Service Pack 2)

I cannot import any SSIS packages nor create any new folders under Stored Packages. I have googled the newsgroups and looked at BOL to no avail. HELP!!!!

View 20 Replies View Related






