Can I Use SSIS To Port Directly From ODS To AS And Bypass Persisting In Intermediate DW?
Jul 20, 2007
At this link here:
http://www.microsoft.com/technet/prodtechnol/sql/2005/dwsqlsy.mspx
I read this snippet here:
Push data from the Integration Services pipeline or a custom application. The data can flow directly into an Analysis Services partition from the Integration Services package pipeline, without intermediate storage. This scenario can be used to reduce the latency (and storage cost) of analytical data.
What I think this means is that using SSIS I could bypass porting the data from ODS to DW, and instead use SSIS control flow/data flow components to direct the data straight to the Analysis Services Cube database storage partition.
I was hoping that by posting this thread on this MSDN forum I could get details or links from those more knowledgeable than I on this subject, pointing me to how I might go about using SSIS to implement this approach. Perhaps package code examples or instructions about the components I could use to do this?
View 1 Replies
Nov 22, 2007
Is there a way to query an Excel spreadsheet directly from SQL Server without using SSIS or Excel macros, and without saving the spreadsheet to a table first?
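One option, assuming the Jet OLE DB provider is installed on the server and ad hoc distributed queries are enabled, is OPENROWSET; the file path and sheet name below are placeholders:

-- requires: EXEC sp_configure 'Ad Hoc Distributed Queries', 1; RECONFIGURE;
SELECT *
FROM OPENROWSET('Microsoft.Jet.OLEDB.4.0',
                'Excel 8.0;Database=C:\Data\Workbook.xls;HDR=YES',
                'SELECT * FROM [Sheet1$]');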
View 5 Replies
View Related
Sep 5, 2006
Can I connect directly to Outlook using SSIS? If so, are there any white papers or walkthroughs I could follow?
TIA
Tom
View 1 Replies
View Related
Jan 23, 2007
Hi:
Just wanted to find out what port SSIS uses when communicating with a remote SQL Server. Is it 1434, 1433, or something else? I have an SSIS package on Server 1 which does inserts into a SQL database on Server 2. I think some ports are being blocked between Server 1 and Server 2, because the two servers are separated by a firewall. When SSIS makes a connection to Server 2 (SQL), what ports does it use? Any other tips would also help. Here is the error I am getting:
Cannot connect to 10.xx.xx.xx.
===================================
An error has occurred while establishing a connection to the server. When connecting to SQL Server 2005, this failure may be caused by the fact that under the default settings SQL Server does not allow remote connections. (provider: Named Pipes Provider, error: 40 - Could not open a connection to SQL Server) (.Net SqlClient Data Provider)
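For what it's worth, a default instance normally listens on TCP 1433, the SQL Browser service answers on UDP 1434, and the Named Pipes provider shown in the error above rides on SMB (TCP 445). A quick sanity check, run on Server 2 itself, shows the transport and port the current connection is actually using:

-- shows the transport and TCP port of the connection running this query
SELECT net_transport, local_tcp_port
FROM sys.dm_exec_connections
WHERE session_id = @@SPID;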
View 12 Replies
View Related
Aug 5, 2015
We have SSIS packages deployed and scheduled on SQL_Server_A. The packages have to access databases on SQL_Server_B, which is behind a firewall.
Question: would opening only the SQL port between these two servers be enough, or do other ports have to be open as well? (The packages only select and bring data back to Server_A, or transfer data from Server_A to the databases and do inserts. Nothing else.)
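Generally TCP 1433 (or the instance's fixed port) is the main requirement, plus UDP 1434 if SQL_Server_B is a named instance resolved through SQL Browser. One hedged way to confirm the actual listener port is to search the target server's error log; the search-string arguments of xp_readerrorlog are undocumented but widely used:

-- run on SQL_Server_B: lists the 'Server is listening on' entries from the current error log
EXEC master.dbo.xp_readerrorlog 0, 1, N'Server is listening on';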
View 1 Replies
View Related
Jun 7, 2007
I have a Pocket PC app that is using a SQL CE (.sdf) database. The inserts, updates and deletes are not persisting when I exit debug. It seems like the changes are getting into the database, because I rebind the drop-down lists after the inserts, updates, etc. and the new/changed data shows correctly after rebinding (when I repopulate the drop-downs I make a call to the database; I don't just manually add the new item to the drop-down, so it seems like the changes are reaching the database). However, once I stop debugging and restart debugging, all my changes are gone. I'm not using transactions, but is there something that I need to do to commit the db changes?
Thanks
View 3 Replies
View Related
Apr 7, 2006
I was able to write a Script Task that sets the values of several package variables. But when the execution completes, the variables still contain their default values.
What I want is a method of persisting the values assigned to a variable through a script. I believe this was possible in DTS.
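SSIS variables only live for the duration of a package execution, so values set in a Script Task are discarded when the run ends; package configurations or a small state table are the usual workarounds. A minimal sketch of the table approach follows; dbo.PackageVariableStore and all of the names and values are hypothetical:

-- hypothetical store for values a package wants to carry across executions
CREATE TABLE dbo.PackageVariableStore
(
    PackageName   sysname        NOT NULL,
    VariableName  sysname        NOT NULL,
    VariableValue nvarchar(4000) NULL,
    LastUpdated   datetime       NOT NULL DEFAULT GETDATE(),
    CONSTRAINT PK_PackageVariableStore PRIMARY KEY (PackageName, VariableName)
);

-- an Execute SQL Task at the end of the package would run something like this,
-- with the literal values supplied through parameter mappings
IF EXISTS (SELECT 1 FROM dbo.PackageVariableStore
           WHERE PackageName = N'MyPackage' AND VariableName = N'LastBatchId')
    UPDATE dbo.PackageVariableStore
    SET VariableValue = N'42', LastUpdated = GETDATE()
    WHERE PackageName = N'MyPackage' AND VariableName = N'LastBatchId';
ELSE
    INSERT INTO dbo.PackageVariableStore (PackageName, VariableName, VariableValue)
    VALUES (N'MyPackage', N'LastBatchId', N'42');

A matching Execute SQL Task at the start of the next run would read the row back into the variable through a result-set mapping.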
View 1 Replies
View Related
Jul 23, 2005
Hi all,
I am rather new to database design and modelling concepts in general and was hoping for some advice on a problem I am trying to solve. I have designed a piece of software that creates a tree with pluggable nodes. Each node class can have 0 to n distinct classes plugged into it to define the type for that node.
For example, a node plugged with a 'customer' class and an 'engineer' class would indicate that this node in the tree is an engineer who is also a customer. We could also have an 'owner', 'engineer', etc.
I now want to persist this tree in a SQL Server 2000 database. I have chosen to implement the nested set model, and have thought about the following table design:
table NODE_TABLE: lft INTEGER, rft INTEGER, propsID INTEGER
table PROPERTIES_TABLE: propsID INTEGER, tableName VARCHAR
table CUSTOMER_TABLE: propsID INTEGER, firstname CHAR, lastname CHAR
table ENGINEER_TABLE: propsID INTEGER, num_completed_projects INTEGER, degree CHAR, school CHAR
table OWNER_TABLE: propsID INTEGER, companyName CHAR
So, given the above example, I would have a NODE_TABLE row that links to 2 entries in PROPERTIES_TABLE. One entry would link to an entry in the CUSTOMER_TABLE, the other to an entry in ENGINEER_TABLE.
Are there any more efficient solutions to this problem? As I said, I am very new to DB design and would welcome any feedback or suggestions of how else I might model my pluggable tree in a database. Thank you,
Bob Yohan
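For reference, here is the sketch above rendered as T-SQL DDL. The column lengths are assumptions (the original lists only CHAR and INTEGER), and propsID is deliberately not unique in PROPERTIES_TABLE because one node can link to several plugged classes:

CREATE TABLE NODE_TABLE (
    lft     int NOT NULL,
    rgt     int NOT NULL,   -- 'rft' in the post; nested set models usually call this rgt
    propsID int NOT NULL
);

CREATE TABLE PROPERTIES_TABLE (
    propsID   int         NOT NULL,   -- one row per class plugged into the node
    tableName varchar(50) NOT NULL    -- which subtype table holds the details
);

CREATE TABLE CUSTOMER_TABLE (
    propsID   int         NOT NULL,
    firstname varchar(50) NULL,
    lastname  varchar(50) NULL
);

CREATE TABLE ENGINEER_TABLE (
    propsID                int         NOT NULL,
    num_completed_projects int         NULL,
    degree                 varchar(50) NULL,
    school                 varchar(50) NULL
);

CREATE TABLE OWNER_TABLE (
    propsID     int          NOT NULL,
    companyName varchar(100) NULL
);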
View 4 Replies
View Related
Mar 15, 2007
I have built a Table Valued SqlFunction which streams out filtered results from a table. The way that I have implemented this is by using a custom class which implements the IEnumerator interface.
The class internally stores a SqlDataReader and a SqlConnection as private member variables. The Class initializer (ie: the New function) creates a SqlCommand which is executed with ExecuteReader into the SqlDataReader and returns.
The IEnumerator.MoveNext method loops through the SqlDataReader until it finds a result which matches the heuristic filter, and the IEnumerator.Current method returns the current result from the SqlDataReader.
Unfortunately, as soon as the initializer returns, the SqlDataReader is automatically closed, and I can't figure out why. I've debugged through this: at the end of the initializer the SqlDataReader is definitely open, and as soon as the first call to MoveNext runs it is closed.
Can anyone offer any suggestions on how I can fix this?
In the interim I've had to load all of the filtered results into a temporary ArrayList in the initializer, which defeats the purpose of streaming out the results.
View 3 Replies
View Related
Feb 6, 2008
Hello All,
I am a super newbie here.
I have a Form that submits to a new window with a reportViewer control in it.
I am using the request variables for the report parameters.
Everything works fine until I click a drill down link.
The error comes in the space where the viewer would be, and just says that the first report parameter has no value.
Sorry if this is a dumb question.
View 3 Replies
View Related
Aug 24, 2005
Hi,
I have relationships built for several tables and I want to bypass the FK constraints when deleting/truncating a table. The FK works fine when someone is trying to delete something within the application, but at an administrative level it is becoming VERY BOTHERSOME! Any help will be greatly welcomed!
__
Updated 8/24 @ 16:24
I tried a little experiment. I took the constraints off of my two main tables, "manufacturers" and "products". I tried to TRUNCATE the tables again and I got the same error message. My products table seems to have other constraints involving two other tables. I checked out the two tables and both of them are empty. What's the deal?! Again, any help is welcomed!
__
Updated 8/24 @ 17:02
OK, so I tried to disable the constraints with ALTER TABLE products NOCHECK CONSTRAINT ALL, but TRUNCATE still failed.
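For what it's worth, NOCHECK only suspends constraint checking; TRUNCATE TABLE is refused as long as any foreign key references the table, enabled or not. A hedged sketch, where every name other than products and manufacturers is hypothetical:

-- deleting in child-first order works even with the foreign keys in place
-- (NOCHECK is only needed if parents must be deleted while child rows remain)
DELETE FROM dbo.products;        -- child of manufacturers in this schema
DELETE FROM dbo.manufacturers;

-- TRUNCATE TABLE dbo.products is refused as long as ANY foreign key references
-- the table, even a disabled one; the referencing constraints must be dropped first
-- (dbo.orders and FK_orders_products are hypothetical names):
-- ALTER TABLE dbo.orders DROP CONSTRAINT FK_orders_products;
-- TRUNCATE TABLE dbo.products;
-- ALTER TABLE dbo.orders ADD CONSTRAINT FK_orders_products
--     FOREIGN KEY (productID) REFERENCES dbo.products (productID);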
View 1 Replies
View Related
Jul 3, 2004
Well, I'm trying to install some software on someone's computer. I don't know his password, and he has forgotten it. How do I get past the login screen and assign a new name and password?
View 2 Replies
View Related
Nov 22, 2006
Is it possible to have the Script Component read X number of rows and then stop reading, passing just those X rows to the destination?
View 4 Replies
View Related
Apr 30, 2015
I've recently started a new position, and our production box contains a procedure that uses 30+ temp tables. I'm currently not in a position to change this, as it's production and I would have to be granted a window to re-design it.
However, tempdb is showing some strange activity.
If a table is created as #CarrierService (CarrierServiceID, DeliveryZoneID, CollectionZoneID), for example, then once the procedure is called it will appear in tempdb with the session info appended, as expected:
#CarrierService________________________________________________________2C78E45A
However, once the session has ended, the above table gets dropped and a new one created:
#2C78E45A. I now have 7,000 of these different tables in tempdb.
When I interrogate this using:
SELECT o.name, o.create_date, o.modify_date, c.name, c.column_id
FROM tempdb.sys.objects o
INNER JOIN tempdb.sys.columns c
    ON o.object_id = c.object_id
WHERE o.type = 'U'
I get the following results:
name        create_date        modify_date        object_id    column name
#2C78E45A   26/04/2015 18:09   30/04/2015 14:55   746120282    CarrierServiceId
#2C78E45A   26/04/2015 18:09   30/04/2015 14:55   746120282    CollectionZoneId
#2C78E45A   26/04/2015 18:09   30/04/2015 14:55   746120282    DeliveryZoneId
Notice how it is still being modified today.
View 9 Replies
View Related
May 19, 2008
Hi,
I am writing a BI solution for a recruitment company. In their business, there can be n number of participants from different dimensions linked to the same fact record. For example, a client can be sent the CVs of 50 candidates. That's my first problem. My second problem is the variety of dimension participant types for a given fact record, which results in the need for nullable dimension FKs, something I'm trying to avoid.
For example, consider the following two business events. In the first one, a candidate fills a job. Easy: we have a record in the fact table, where the fact table has the following columns: DateKey, EventType, CandidateKey, VacancyKey. No nullable columns, great. But there are other events that I want to store in the fact table too. Let's go back to my first example: the client is sent the CVs of 50 candidates in one transaction. So there is one client linked to the fact, but 50 candidates. Now I need to extend the fact table and add another column: CandidateGroupKey (which links to an intermediate fact table). But in this case there was no vacancy involved. So do I now have to make the VacancyKey column nullable? That doesn't seem like a good idea...
Or do I have to go for a completely different approach and have different fact tables instead of just one?
Anyone have any suggestions?
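One common pattern, sketched below with purely illustrative names, is a bridge (intermediate fact) table for the many-to-many candidate relationship plus a surrogate "Not Applicable" member in the vacancy dimension, so VacancyKey never has to be nullable:

-- bridge table linking one CandidateGroupKey to many candidates
CREATE TABLE dbo.DimCandidateGroup (
    CandidateGroupKey int NOT NULL,
    CandidateKey      int NOT NULL,
    CONSTRAINT PK_DimCandidateGroup PRIMARY KEY (CandidateGroupKey, CandidateKey)
);

-- fact table: instead of a NULL VacancyKey, point at a surrogate
-- "Not Applicable" row (key 0) in the vacancy dimension
CREATE TABLE dbo.FactRecruitmentEvent (
    DateKey           int NOT NULL,
    EventType         int NOT NULL,
    CandidateGroupKey int NOT NULL,
    VacancyKey        int NOT NULL DEFAULT 0   -- 0 = 'N/A' member in DimVacancy
);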
View 1 Replies
View Related
Oct 24, 2013
I've been working through a programming diploma the last few years and have a solid foundation in Normalization, SQL, SQL Server, MySQL, Procs, Triggers, etc. Basically I'd say I'm at least a competent junior programmer on the back-end.. able to build simple joins, update data, and research/read/understand documentation.
Right now, and at the start of the year I was in a co-op term where I got some industry level SQL experience, so I know what a legitimate database looks like, and I've seen some scary looking procs. And now I've found that I quite enjoy working in SQL land and I'd like to take my skills up a notch.
With that said, the only method I can think of to improve my SQL skills is to find a sizeable sample database to load into SQL server, hopefully with corresponding exercises, and then to code those exercises.
View 4 Replies
View Related
Jun 4, 2007
Good Afternoon.
I have one question about clustered indexes.
A clustered index has a B-tree architecture...
What is the intermediate level of a clustered index?
Thanks..
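In a B-tree, the leaf level of a clustered index holds the data pages themselves, the root is the single top page, and everything in between is the intermediate level. On SQL Server 2005 the levels can be inspected directly; dbo.MyTable below is a placeholder:

-- index_id = 1 is the clustered index; index_level 0 is the leaf,
-- the highest index_level is the root, the rest are intermediate levels
SELECT index_level, page_count, record_count
FROM sys.dm_db_index_physical_stats(DB_ID(), OBJECT_ID('dbo.MyTable'), 1, NULL, 'DETAILED');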
View 5 Replies
View Related
Jul 11, 2002
I am looking for some advice. On a 6.5 database a user seemed to be getting slow response, so someone decided to stop and start the services. The database in question went into recovery mode. 24 hours later a network tech decided to stop and start the services again because the database was still in recovery. 5 hours later they decided to run the statement to put the database in Bypass Emergency Mode - "update sysdatabases set status = -32768 where name = 'dbname'" - and then they set the status back to 0, which made the database available again.
What can I do to ensure they did not damage the database? Can I compare the backup prior to the issue with the current database?
Thanks
Bill
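Comparing against the old backup is awkward; the more direct check, once the database is available again, is to run the consistency checks that 6.5 provides (substitute the real database name for dbname):

DBCC NEWALLOC (dbname)      -- allocation checks
DBCC CHECKDB (dbname)       -- structural checks on every table and index
DBCC CHECKCATALOG (dbname)  -- system catalog consistency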
View 2 Replies
View Related
Feb 3, 2000
Is it possible to have a login that can access a table, which has an Update trigger on a column, and do some updating on another column but not have the trigger fire?
I cannot disable the Trigger. This db is on production and the trigger cannot be taken off. I also cannot bcp the data out and do the updating and bcp back in. The production table must remain as is, however I need to update some cols without the trigger firing.
Any ideas??
Thank you,
tw
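If a brief maintenance window is acceptable, one hedged approach is to disable the trigger only inside the transaction that performs the update; the schema lock taken by ALTER TABLE keeps other sessions from updating the table while the trigger is off. All names below are hypothetical. If disabling really is not an option, the trigger itself would have to be changed to check a session flag, which the poster has ruled out:

BEGIN TRAN;
    ALTER TABLE dbo.MyTable DISABLE TRIGGER trg_MyTable_Update;  -- hypothetical names
    UPDATE dbo.MyTable
    SET OtherColumn = 'new value'      -- the column the trigger does not watch
    WHERE SomeKey = 42;
    ALTER TABLE dbo.MyTable ENABLE TRIGGER trg_MyTable_Update;
COMMIT;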
View 2 Replies
View Related
Sep 4, 2006
Hi all
My database 'a2rd' was marked as suspect, and I did the following steps to make it available:
USE MASTER
select name, dbid, mode, status from sysdatabases where dbid =
db_id('a2rd')
UPDATE SYSDATABASES SET STATUS=-32768 WHERE NAME='a2rd'
EXEC sp_configure 'show advanced option', '1'
EXEC sp_configure 'allow updates', '1'
RECONFIGURE WITH OVERRIDE
DBCC NEWALLOC(a2rd)
DBCC TEXTALL (a2rd)
DBCC CHECKDB (A2RD)
Now how can I take a backup of the database that was in emergency mode?
Thanks in advance..
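A BACKUP cannot be taken while the database is still flagged with the emergency/bypass status. Assuming the DBCC checks came back clean, a hedged sequence on SQL Server 2000 is to reset the status, close the configuration loopholes, and then back up normally (the backup path below is a placeholder):

EXEC sp_resetstatus 'a2rd';
EXEC sp_configure 'allow updates', '0';
RECONFIGURE WITH OVERRIDE;
-- restart SQL Server (or at least re-open the database), then:
BACKUP DATABASE a2rd TO DISK = 'D:\Backups\a2rd_full.bak';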
View 2 Replies
View Related
Oct 25, 2007
Greetings everyone,
I am seeing a particular problem in the XML Source Editor "Columns" configuration where it is not persisting the "Output name" selection.
Control Flow Tab:
1. I use an "Exec SQL Command" to drop, create, or alter the destination tables in the database that I want to be the repository for the inbound XML data. The data types are fairly straightforward.
2. I add a singular "Data Flow"
Data Flow Tab:
1. I add an "XML Source" task and assign a well-defined XML file. I then use the "Generate XSD" option in the "Connection manager", and I am fairly satisfied with the generated XSD.
2. I create "OLE DB Destination"
3. I wire the "XML Source" to the "OLE DB Destination", then open the "Columns" page of the "XML Source" editor.
4. I go to the dropdown list of "Output name" and see the list ordered with the various complex-types that I want to map and transfer to a target table.
For the sake of this report, I select the 5th one down on the list (for which I already have a target table) - let's call this "Mesh"
5. In the "Input Output" dialog, I select the "output" to be the desired 5th item, "Mesh"
6. I check all my mappings so that they map one-to-one ... XML name entries match SQL table destination mapping entries; correct types; correct size
7. Check the metadata and it all looks good.
8. When I hit "Debug" to test the package the failure occurs at the "XML Source". The error report comes back saying that it failed because "field xxx in Contributor was truncated". However, "Contributor" corresponds to the 1st name in the dropdown list presented in "Columns" "Output name:".
If I return to Step 4, when I open up "Columns" I see that my previous selection of the 5th item on the list, "Mesh", was not persisted; no matter how often I re-select item #5 "Mesh" and save to ensure the selection sticks, it is not persisted.
I hand-edited the .dtsx file and only then was I able to make this stick. However, if I ever re-save the package this non-persistency pops up again.
Am I doing something wrong here or is this a known defect? As I have several dozen XSD mappings that I want to transfer to tables, hand-editing is not something I relish.
I look forward to your reply.
RudyC
View 1 Replies
View Related
Aug 8, 2001
If I use DTS to import a text file, will insert triggers fire on this table?
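It depends on how DTS loads the rows: the Bulk Insert task and a Transform Data task using fast load take the bulk-load path, which by default does not fire insert triggers, while ordinary row-by-row inserts do. The equivalent knob in T-SQL, shown as a sketch with placeholder table and file names, is FIRE_TRIGGERS:

-- bulk loads skip INSERT triggers unless FIRE_TRIGGERS is specified
BULK INSERT dbo.TargetTable
FROM 'C:\Data\import.txt'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', FIRE_TRIGGERS);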
View 1 Replies
View Related
Nov 21, 2007
1st question I have to ask is:
Is Emergency Mode the same as Bypass Recovery Mode?
2nd:
I have a SQL2005 Database that was suspect this morning. I put the database in Emergency Mode: alter database {dbname} set emergency.
I did a dbcc checkdb on the database and found some corrupt indexes. I attempted to do a dbreindex on the table and got the error:
Could not run BEGIN TRANSACTION in database '{dbname}' because the database is in bypass recovery mode.
How can I put the database into single_user mode and get to a point where I can reindex the one table that is causing the problems? I would even be willing to drop the indexes and re-create them.
Any suggestions or help would be appreciated!
Thanks in advance.
Pete
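A hedged sketch of the usual route on SQL Server 2005, with 'MyDb' as a placeholder: get exclusive access and let CHECKDB repair while the database is in the emergency state. REPAIR_ALLOW_DATA_LOSS can discard data, so if the corruption is limited to nonclustered indexes, dropping and re-creating just those indexes after bringing the database online is the gentler option:

ALTER DATABASE MyDb SET EMERGENCY;
ALTER DATABASE MyDb SET SINGLE_USER WITH ROLLBACK IMMEDIATE;
DBCC CHECKDB ('MyDb', REPAIR_ALLOW_DATA_LOSS);   -- emergency-mode repair; may lose data
ALTER DATABASE MyDb SET MULTI_USER;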
View 2 Replies
View Related
Nov 23, 2004
In May of this year I graduated from Penn State with a BS in IST (Information Sciences and Technology). Right after graduation I got a database programming job with a company that uses Delphi 6 and MS SQL Server 2000.
I've been working with this company for six months now but I'm still not very good with SQL. I can do basic SQL queries and table joins (as well as use datetime functions and cursors), but I'd say I'm only at an intermediate level (at best).
So... I'm looking to learn more about SQL. I'm guessing a good SQL reference book would help, but I'd really prefer a good book that actually teaches you and guides you along. The only problem is that I don't want a basic/beginner level SQL book since I already know all of the basics.
Can anybody recommend anything for me?
Thanks!
View 2 Replies
View Related
Mar 9, 2004
Quick question. I am in the NYC area (Westchester) and I am using Cognos DecisionStream to build my SQL Server 2000 Enterprise Data Warehouse.
I want to learn more about complex SQL, specifically for SQL Server.
What do you recommend? Should I go for strictly an advanced SQL course, or should I go for SQL Server training?
Does anyone know of a really good site where I can find advanced SQL training?
Also, I want to learn more about scripting and VB. Can anyone recommend a place where I can get this beginner training?
Sorry to take up your time, but to sum up:
1. Advanced SQL training in NYC area
2. Beginner VB training in NYC area.
Cost is not necessarily an issue.
Thanks!
View 5 Replies
View Related
May 18, 2015
I have one SP in which I am opening a transaction and, on failure, rolling it back.
Is there any way I can preserve the data in a temp table up to the failing record?
I mean to say: if record no. 21 fails, I need to be able to see the first 20 records in some temp table.
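One hedged trick: table variables are not affected by ROLLBACK, so rows copied into one during processing survive even after the transaction that wrote the temp table is rolled back. A self-contained illustration:

-- rows copied into a table variable survive the rollback; the temp table's do not
CREATE TABLE #work (RecordNo int);
DECLARE @kept TABLE (RecordNo int);

BEGIN TRAN;
    INSERT INTO #work (RecordNo) VALUES (1);
    INSERT INTO #work (RecordNo) VALUES (2);
    INSERT INTO @kept (RecordNo) SELECT RecordNo FROM #work;
ROLLBACK;

SELECT COUNT(*) AS rows_in_temp_table FROM #work;   -- 0: rolled back
SELECT COUNT(*) AS rows_in_table_var  FROM @kept;   -- 2: preserved
DROP TABLE #work;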
View 11 Replies
View Related
Oct 26, 2007
Hi,
I would like to call the report from an external source:
The report requires a paramater.
I tried the following:
<a href='http://Report.aspx?ItemTakeOn%2fItemTakeOn&ProcessNo=278'> Report</a>"
The link works and I added the parameter in the link (ProcessNo=278);
it opens the report, but I still have to manually enter the parameter.
I also tried setting the parameter as hidden.
How do I bypass this?
Can anyone please assist?
Regards
View 1 Replies
View Related
Aug 5, 2015
I started working with SSAS last week. I need to perform some calculations on the data fetched from the cube, based on the parameters. SSRS is used to display the output and SSAS is used as the data source.
How could I perform the operations on the data fetched from the cube in SSAS? Does SSAS provide a storage structure, like a temp table in a stored procedure, where we can perform the various operations before sending the final data back to the client-side tool (SSRS)?
Or is there an alternative way to perform the operations based on the input provided through the parameters?
View 6 Replies
View Related
Jun 2, 2006
Hi,
I'm just starting off in SSIS and have a question that I can't find an answer to...
I'm loading in a number of files (in separate Data Flows) and performing some transformations on them before merging them back together. What I'm not sure about is what I should be doing with the data at the end of each of my "Import Data From XXXX Flat File" Data Flows. Am I better off using OLE DB Destinations (or SQL Server Destinations) and saving this intermediate data to temporary tables, or am I better off using a Raw File Destinations and saving this intermediate data to files? Or is there, perhaps, a better option that I'm currently unaware of?
If the Raw File Destination is the way to go, then isn't there a maintenance issue with cleaning up all the files created? And will there not be a management issue to ensure that there is sufficient disc space available on the drive you are saving to?
I'm a bit confused and overwhelmed by SSIS at the moment, so any help would be much appreciated!
Thanks in advance,
Lawrie.
View 3 Replies
View Related
Jun 16, 2004
I have a query and I would like to write the contents of the result set out directly to a file, without bothering to create another temp table and then exporting it to a CSV file with a DTS task.
It is a type of reporting that I am trying to do,
but I just need to export the raw data I retrieve from the query.
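One hedged option on SQL Server 2000 is bcp in queryout mode, which writes a query's result set straight to a file; the server name, database, query, and path below are placeholders, and xp_cmdshell must be available:

-- character mode (-c), comma field terminator (-t,), trusted connection (-T)
EXEC master..xp_cmdshell
    'bcp "SELECT col1, col2 FROM MyDb.dbo.MyTable" queryout "C:\export\results.csv" -c -t, -S MYSERVER -T';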
View 3 Replies
View Related
Nov 28, 2007
Hello!
Me and my collegues was recently assigned to maintain a SQL Server 2000 instance, which pretty much only holds one database.
From what we can tell, a backup of this database has never been created, at least not through SQL Server.
When asking the previous owner, the answer was "We have been taking 'Legato backups' every day!"
We now intend to schedule a SQL Server backup job, but out of curiosity...
In case of a disaster, would these "Legato backups" have been any use at all?
Please elaborate :-)
View 5 Replies
View Related
Aug 24, 2007
I have a delivery extension that prints to network printers.
Currently I add a subscription, scheduled a few minutes in the future, via the web service for each report that should print.
Is there a way I can trigger the delivery directly?
Thanks for ideas, guys.
Michael
View 3 Replies
View Related