I create SQL queries on my website and put them in files for the user (.sql, but any file type would be possible). I would like users to be able to easily import them into their MS Access.
I can copy the SQL line and paste it into Access's Design View -> Query menu -> SQL Specific -> Union, but I think that is too complicated. I would just like to import the queries from a file to make it easier for the users of my site.
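One possible approach (a rough, untested sketch; the file path and query name below are placeholders) is a little VBA that reads the .sql file and saves its contents as a query the user can open straight away:

Code:
Sub ImportSqlFile()
    ' Read the whole .sql file into a string, then save it as a named query.
    ' "C:\Queries\MyQuery.sql" and "qryImported" are placeholder names.
    Dim f As Integer
    Dim strSQL As String
    f = FreeFile
    Open "C:\Queries\MyQuery.sql" For Input As #f
    strSQL = Input(LOF(f), #f)
    Close #f
    CurrentDb.CreateQueryDef "qryImported", strSQL
End Sub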
I would like to automate something presently done on a one-by-one basis. A number of text files (containing data) are to be exported into an MS Excel file, with each text file occupying a different worksheet. Presently, the idea is to use the Data / Import External Data / Import Data feature of MS Excel to import the text files one by one into newly created worksheets (within the same workbook).
I would appreciate some advice on how to go about creating a useful MS Access application to achieve the above. I have checked the available macros in MS Access, but I could not find one to suit my purpose. Can anyone please assist with how I can get started?
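One way to get started (a sketch only; the folder path, the tab-delimited assumption and the sheet naming are all guesses about your files) is Access VBA that automates Excel and gives each text file its own worksheet:

Code:
Sub TextFilesToWorkbook()
    ' Put each .txt file found in one folder onto its own worksheet.
    ' C:\Data\ is a placeholder folder; adjust the delimiter settings to suit.
    Dim xl As Object, wb As Object, ws As Object
    Dim strFile As String
    Set xl = CreateObject("Excel.Application")
    Set wb = xl.Workbooks.Add
    strFile = Dir("C:\Data\*.txt")
    Do While strFile <> ""
        Set ws = wb.Worksheets.Add(After:=wb.Worksheets(wb.Worksheets.Count))
        ws.Name = Left(strFile, 31)                      ' sheet names max 31 characters
        With ws.QueryTables.Add(Connection:="TEXT;C:\Data\" & strFile, _
                                Destination:=ws.Range("A1"))
            .TextFileParseType = 1                       ' 1 = xlDelimited
            .TextFileTabDelimiter = True                 ' assuming tab-delimited data
            .Refresh BackgroundQuery:=False
        End With
        strFile = Dir
    Loop
    xl.Visible = True                                    ' hand the finished workbook to the user
End Sub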
Hello everyone, I really need some detailed help as the deadline is approaching and I need to find a solution for this. Any help would be greatly appreciated.
I currently have a batch file that FTPs a text file from a Red Hat Linux server to my W2k C: drive. I would like to add another command to the batch file that imports this text file into an existing Access table. I would like the text file to repopulate the table every time it is imported; I do not want the data appended to the existing data in the table.
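One way to do the Access side (a sketch; tblImport, the file path and the macro name are placeholders) is a procedure that empties the table and re-imports the file, which the batch file can trigger with Access's /x command-line switch:

Code:
Sub ReloadFromTextFile()
    ' Wipe the existing rows, then re-import the FTP'd file so the table
    ' is repopulated rather than appended to.  Names below are placeholders.
    CurrentDb.Execute "DELETE * FROM tblImport", dbFailOnError
    DoCmd.TransferText acImportDelim, , "tblImport", "C:\data.txt", False
End Sub

The batch file could then add a line such as msaccess.exe "C:\MyDb.mdb" /x mcrReload, where mcrReload is a macro that runs this procedure (RunCode) and then quits Access.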
I have MANY scanned image (.tif) files in multiple folders based on certain criteria. I need to find a way to import only the names of these .tif files into a table or even into an Excel spreadsheet. Since they are scanned images, you cannot do a simple copy/paste.
I would appreciate it if anyone has any ideas on how this could be done.
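One idea (a rough sketch; tblImages, its FileName field and the folder path are invented) is to loop through each folder with Dir and write just the file names to a table:

Code:
Sub ListTifFiles()
    ' Insert the name of every .tif file in one folder into a table.
    ' tblImages (with a text field FileName) and C:\Scans\ are placeholders.
    Dim strFile As String
    strFile = Dir("C:\Scans\*.tif")
    Do While strFile <> ""
        CurrentDb.Execute "INSERT INTO tblImages (FileName) VALUES ('" & _
            Replace(strFile, "'", "''") & "')", dbFailOnError
        strFile = Dir
    Loop
End Sub

For multiple folders you would either call this once per folder or walk the folder tree with FileSystemObject.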
I have SQL code that would import an Excel file as follows:
Code: Str = "SELECT [Ed_TEST$].* INTO [tblAct_Import_File] FROM [Ed_TEST$] IN '" & impFile & "' [Excel 8.0; HDR = YES;];" cn.Execute Str
I do not want to use "DoCmd.TransferSpreadsheet"; I am finding that, due to rounding, some figures are brought in a little off. However, I have found that this does not happen if I use the code above.
I was working on a project that involved writing data into an Access database using a C program. The insert query execution was taking very long, so I decided to write to a .csv file and then import it into the desired table. This worked very fast compared to writing directly to the DB. Now I want to automate this process. The user should specify the file name at the command prompt, and I want to call a script so that it automatically imports the contents of the .csv file into the Access DB. I already have a connection established to the DB. All I am looking for is the script that can automatically import the .csv file into the Access DB. Please help me out.
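One option (a sketch, not a finished script; tblData is a placeholder and the CSV is assumed to have a header row with columns matching the table) is to let Jet's text driver do the work, so the whole import is a single SQL statement you could also execute over your existing connection:

Code:
Sub ImportCsv(ByVal strCsvPath As String)
    ' Append a CSV file into an existing table using Jet's text driver.
    ' Assumes the CSV columns line up with the fields of tblData.
    Dim strFolder As String, strFile As String
    strFolder = Left(strCsvPath, InStrRev(strCsvPath, "\") - 1)
    strFile = Mid(strCsvPath, InStrRev(strCsvPath, "\") + 1)
    CurrentDb.Execute "INSERT INTO tblData SELECT * FROM " & _
        "[Text;HDR=Yes;Database=" & strFolder & "].[" & Replace(strFile, ".", "#") & "]", _
        dbFailOnError
End Sub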
I have an Excel worksheet ("player info sheet") where the user inputs information. I then copy that info into another worksheet ("player info") with the data fields that I have defined in Access. I then open up my Access database and do File > Get External Data > Import, and select my Excel file and the worksheet named "player info". The import works fine, but a table gets created called 'Player Info Sheet$'_ImportErrors. I cannot figure out why. Any help would be appreciated. Thanks.
I am importing a CSV file with addresses and post codes into a Properties table. However, there is another field in the table called Area. I want the import routine to somehow use a lookup table containing every post code and the area it is in, and then insert the relevant Area into each record in the Properties table. I have tried using DLookup in an update query, but this just inserts the first Area in the lookup table into all records in the Properties table.
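An update query that joins on the post code (instead of DLookup) should do it; tblPostcodeAreas and its field names below are made up, so substitute your own:

Code:
UPDATE Properties INNER JOIN tblPostcodeAreas
    ON Properties.PostCode = tblPostcodeAreas.PostCode
SET Properties.Area = tblPostcodeAreas.Area;

DLookup with no criteria always returns the value from the first record, which is why every property was getting the same Area.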
A curious problem: I import CSV data into an AC97 table, and this has worked without any problems for years. However, we have recently taken on a new member of staff, and one of the fields being loaded is his initials, 'SF', which results in import errors. I tested other letters ('ME', 'MF', etc.) with no problem. Is 'SF' some kind of reserved word? A simple solution would be to call him 'SEB' instead of 'SF', but I wondered if anybody else has come across this?
Hi, I have a log file that records an action in the following format:
50144021 12-17-2004 21:00:44 Mail Sent Subject: Test file TO: bert@xxxxxx.com
I want my database to look into this file and return the date/time of the last send in the log, to match it up with a record in one of my tables that has the following fields: "email", "last sent", "subject". (The match will be done on the email address.)
I can therefore look at each record and identify when each email address was last sent the file.
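A rough sketch of the parsing side (the log path is a placeholder, and the date parsing assumes the mm-dd-yyyy format shown above works with your regional settings):

Code:
Function LastSent(strEmail As String) As Variant
    ' Scan the log file and return the most recent date/time a given
    ' address was mailed, or Null if it never appears.
    Dim f As Integer, strLine As String
    Dim vParts As Variant, dtThis As Date
    LastSent = Null
    f = FreeFile
    Open "C:\mail.log" For Input As #f
    Do While Not EOF(f)
        Line Input #f, strLine
        If InStr(1, strLine, "TO: " & strEmail, vbTextCompare) > 0 Then
            vParts = Split(strLine, " ")
            dtThis = CDate(vParts(1) & " " & vParts(2))   ' "12-17-2004 21:00:44"
            If IsNull(LastSent) Or dtThis > LastSent Then LastSent = dtThis
        End If
    Loop
    Close #f
End Function

You could then loop through the table (or use the function in an update query) to fill in "last sent" for each email address.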
I have a CSV file which I am able to import using the VBA DoCmd.TransferText action, but the first three lines are headers and I do not want to import them.
xxxxx, xxxxx, Field1, Field2, Fileld3,
Is there a way to skip the first 3 lines of the text file, i.e. only import from row 4 onwards?
Please Help
My current transfer method also creates a table of errors for the first three lines (as they do not match the data types specified by the target table). It also creates a whole lot of null fields in the first three rows of the table. I am running a loop over a whole lot of CSV files, so you can imagine I'm generating a lot of import error tables. If I could import only rows 4 onwards there would be peace in the universe!
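One workaround (a sketch; tblData and the temp-file handling are assumptions) is to write a copy of each CSV without its first three lines and import the copy, so TransferText never sees the headers:

Code:
Sub ImportSkippingHeader(strPath As String)
    ' Copy the CSV to a temp file minus its first three lines, import the
    ' copy, then delete it.  tblData is a placeholder table name.
    Dim fIn As Integer, fOut As Integer
    Dim strLine As String, lngLine As Long
    Dim strTemp As String
    strTemp = strPath & ".tmp"
    fIn = FreeFile
    Open strPath For Input As #fIn
    fOut = FreeFile
    Open strTemp For Output As #fOut
    Do While Not EOF(fIn)
        Line Input #fIn, strLine
        lngLine = lngLine + 1
        If lngLine > 3 Then Print #fOut, strLine   ' drop the three header lines
    Loop
    Close #fIn
    Close #fOut
    DoCmd.TransferText acImportDelim, , "tblData", strTemp, False
    Kill strTemp
End Sub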
Does anyone know if there is a special way to import a CSV file starting from a specific row? I have CSV files where the first row is header information; the file data doesn't actually start until the second row. I know I can write a module to do the task, but is there an import specification I can use?
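If the only extra line is that single header row, the HasFieldNames argument of TransferText (the final True below) already tells Access to treat row 1 as field names and start the data at row 2; a saved import specification can store the same setting. The spec and table names here are placeholders:

Code:
' True at the end = first row contains field names, so the data import starts at row 2.
DoCmd.TransferText acImportDelim, "MyImportSpec", "tblData", "C:\data.csv", True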
Hi all. Another question that I hope y'all can answer.
I'm trying to take a comma-delimited text file and import the raw data into one Access table to allow for various data manipulations, with the end result being a very nice printable report. Unfortunately, I can only get the data as a plain text file, not a CSV file.
What I'm looking for is a method where I can import one or many of these text files into a database at one time via a fairly automatic process (pressing a button to load all the text files in a given directory would work), and have the data filtered according to the pre-defined variables in the text file itself (which could just be pre-entered into the database as a template). How would I go about doing this?
p.s. If anyone wants to see an Excel file of a manual data sort to see what i'm talking about, please e-mail me and I'll send it off.
p.p.s. Thanks for any help you can give me
raw text file data (there's more, but this will suffice as an example):
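For the button-press part, a sketch along these lines would load every text file in one folder into the same table (the folder, table name and the comma-delimited assumption are placeholders; the filtering/template step would follow afterwards with queries):

Code:
Private Sub cmdImportAll_Click()
    ' Import every .txt file in one folder into a single raw-data table.
    ' Add an import specification name as the second TransferText argument
    ' if you have one defined.
    Dim strFolder As String, strFile As String
    strFolder = "C:\Imports\"
    strFile = Dir(strFolder & "*.txt")
    Do While strFile <> ""
        DoCmd.TransferText acImportDelim, , "tblRawData", strFolder & strFile, False
        strFile = Dir
    Loop
End Sub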
I am currently trying to import a text file into an Access (2000) Table. My text file has a header line (first record) and a footer line (last record) which I want to ignore when importing. I am sure someone out there would have had a need to do something similar to this in the past. I am wondering if someone can point me in the right direction.
I have tried a few different Google searches and nothing promising seems to come back, mostly just references to the "First Row Contains Field Names" checkbox.
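One direction to look (a sketch only; tblData, its three fields and the comma-delimited assumption are invented) is to read the file yourself, skip the first and last lines, and append the rest through a recordset:

Code:
Sub ImportWithoutHeaderFooter(strPath As String)
    ' Read the whole file, drop the header (first line) and footer (last
    ' non-blank line), and append the remaining rows to a table.
    Dim f As Integer, vLines As Variant, vFields As Variant
    Dim i As Long, lngLast As Long
    Dim rs As DAO.Recordset
    f = FreeFile
    Open strPath For Input As #f
    vLines = Split(Input(LOF(f), #f), vbCrLf)
    Close #f
    lngLast = UBound(vLines)
    Do While Len(Trim(vLines(lngLast))) = 0
        lngLast = lngLast - 1                  ' ignore trailing blank lines
    Loop
    Set rs = CurrentDb.OpenRecordset("tblData", dbOpenDynaset)
    For i = 1 To lngLast - 1                   ' element 0 = header, lngLast = footer
        vFields = Split(vLines(i), ",")
        rs.AddNew
        rs!Field1 = vFields(0)                 ' placeholder field names
        rs!Field2 = vFields(1)
        rs!Field3 = vFields(2)
        rs.Update
    Next i
    rs.Close
End Sub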
Working with a bank that wants a file from us so they can import into their check reconciliation program. Got the export from the accounting program working for the info they need, but the bank guy says he needs a file structure like this:
(first row is the header; don't worry about what the values mean for now)
1234578990000000000000000
(rest of the rows are details)
1234589000807091234
1234589000807093456
Notice the line breaks between the header and the detail rows? Wouldn't this call for a line break and mess up an import program? The bank guy isn't a database person and is clueless.
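If the question is how to produce such a file from Access, writing it out line by line keeps full control over the layout; the table name, field name, output path and the header value (copied from the sample above) are placeholders:

Code:
Sub ExportBankFile()
    ' Write one header line followed by one line per detail record.
    Dim f As Integer
    Dim rs As DAO.Recordset
    f = FreeFile
    Open "C:\bankfile.txt" For Output As #f
    Print #f, "1234578990000000000000000"          ' header record
    Set rs = CurrentDb.OpenRecordset("SELECT CheckLine FROM tblChecks")
    Do While Not rs.EOF
        Print #f, rs!CheckLine                     ' one detail record per line
        rs.MoveNext
    Loop
    rs.Close
    Close #f
End Sub

A line break after every record is just how flat files are laid out, so it should not upset the bank's import.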
Let me preface this question with... I DID NOT CREATE NOR DO I HAVE ANY CONTROL OVER THE FILE I'M NEEDING TO IMPORT INTO ACCESS.
I've got a situation where I'm needing to normalize a delimited .CSV file on a routine basis. The .CSV file has 369 fields. When normalized correctly, the true data should only be about 60 fields.
I didn't think this would be such a hard thing... just import the first 255 fields into one table, and the remaining fields into another table. Then, using a query... normalize the database as necessary.
I've scoured this topic all over... I've seen solutions for "fixed width" files, but not delimited. The only helpful thread I've found says that this is possible only through very complicated parsing through the file.
That's where I'm stuck... This is definitely over my head. If anyone has any help on this I sure would appreciate it.
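For what it's worth, a sketch of the parsing approach (tblNormalized, its fields and the column positions are invented, and it assumes no commas inside quoted fields):

Code:
Sub ImportWideCsv(strPath As String)
    ' Parse each line yourself so Access's 255-field import limit never comes
    ' into play, and keep only the columns that hold real data.
    Dim f As Integer, strLine As String, v As Variant
    Dim rs As DAO.Recordset
    Set rs = CurrentDb.OpenRecordset("tblNormalized", dbOpenDynaset)
    f = FreeFile
    Open strPath For Input As #f
    Line Input #f, strLine                 ' header row - remove this line if there isn't one
    Do While Not EOF(f)
        Line Input #f, strLine
        v = Split(strLine, ",")            ' simple split; no quoted-comma handling
        rs.AddNew
        rs!CustomerID = v(0)               ' placeholder fields and column positions
        rs!FirstName = v(5)
        rs!LastName = v(6)
        ' ...pick out whichever of the 369 columns you actually need...
        rs.Update
    Loop
    Close #f
    rs.Close
End Sub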
I'm looking for the best method. I have another piece of software that works alongside my Access database. At the end of each month, ProgramA generates an Excel file with the monthly data, and I want to import or link it into Access. I first tried importing it every time the new Excel file is generated. However, there is one line at the end of the Excel file which does not match the field requirements, and it generates an error table in Access saying a number field cannot contain a string.
Then I tried the link method instead, but that does not work reliably either. The first time is OK, but the next time, when I guess more lines are generated than in the original Excel file, it could not be opened. The number of columns is the same.
The best method right now is to delete the last line of the newly generated Excel file; however, because I am not the one using it, I want a better method for my co-workers.
Is there a way to import the Excel data except for the last line, or a way to import the Excel file without an error table being generated?
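One approach (a sketch; tblMonthly, Sheet1, column G and the data-starts-at-A1 assumption are all placeholders) is to ask Excel for the last used row and then import everything above it, since the Range argument of TransferSpreadsheet is honoured on import:

Code:
Sub ImportExcelWithoutLastRow(strFile As String)
    ' Find the last used row with Excel automation, then import only the
    ' rows above it so the trailing summary line never reaches Access.
    Dim xl As Object, wb As Object
    Dim lngLast As Long
    Set xl = CreateObject("Excel.Application")
    Set wb = xl.Workbooks.Open(strFile)
    lngLast = wb.Worksheets("Sheet1").UsedRange.Rows.Count
    wb.Close False
    xl.Quit
    DoCmd.TransferSpreadsheet acImport, acSpreadsheetTypeExcel9, "tblMonthly", _
        strFile, True, "Sheet1!A1:G" & (lngLast - 1)
End Sub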
I want to import a CSV file and store the good data (which matches input masks, validation, etc.) in one table, and all the bad data which is rejected in another table.
I've read a bit about an import error table, and although I do get errors when importing the CSV file, no such table is generated.
I searched on IMPORT, but didn't see anything like the problem I have.
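A common workaround (a sketch; the table names, file path and the IsNumeric(Amount) test are all invented) is to import into a staging table whose fields are all Text, so nothing is rejected, and then split the rows with two append queries using your own validation rules:

Code:
Sub SplitGoodAndBad()
    ' Load the CSV into an all-text staging table, then move rows that pass
    ' the checks to the real table and the rest to a reject table.
    CurrentDb.Execute "DELETE * FROM tblStaging", dbFailOnError
    DoCmd.TransferText acImportDelim, , "tblStaging", "C:\data.csv", True
    CurrentDb.Execute "INSERT INTO tblGood SELECT * FROM tblStaging " & _
        "WHERE IsNumeric(Amount)", dbFailOnError
    CurrentDb.Execute "INSERT INTO tblBad SELECT * FROM tblStaging " & _
        "WHERE NOT IsNumeric(Amount)", dbFailOnError
End Sub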
I have a large Excel file formatted thus:
     COLUMN A         COLUMN B   COLUMN C
1    Full Name
2    Full address
3    City State Zip
4    Phone            SSN        Sex
Alas, as you can see, the first four rows contain information on one person, then Column B contains just their SSN on Row 4, and so forth.
Row 5 begins the cycle again. This goes on for 160 people.
Is there a way to get the employee information contained in Column A into a 'normalized' format, such as Full Name in Col A, Full Address in Col B, and so on?
Unfortunately, it's illegal here to whack the person who provided this data to me.
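A sketch of an Excel macro that would unstack the blocks before importing (it assumes exactly four rows per person with no blank rows in between, and writes the result to Sheet2 with invented column positions):

Code:
Sub NormalizeBlocks()
    ' Turn each 4-row block on Sheet1 into one row on Sheet2:
    ' Name, Address, City State Zip, Phone, SSN, Sex.
    Dim wsIn As Worksheet, wsOut As Worksheet
    Dim r As Long, rOut As Long
    Set wsIn = Worksheets("Sheet1")
    Set wsOut = Worksheets("Sheet2")
    rOut = 1
    For r = 1 To 160 * 4 Step 4
        wsOut.Cells(rOut, 1).Value = wsIn.Cells(r, 1).Value       ' Full Name
        wsOut.Cells(rOut, 2).Value = wsIn.Cells(r + 1, 1).Value   ' Full address
        wsOut.Cells(rOut, 3).Value = wsIn.Cells(r + 2, 1).Value   ' City State Zip
        wsOut.Cells(rOut, 4).Value = wsIn.Cells(r + 3, 1).Value   ' Phone
        wsOut.Cells(rOut, 5).Value = wsIn.Cells(r + 3, 2).Value   ' SSN
        wsOut.Cells(rOut, 6).Value = wsIn.Cells(r + 3, 3).Value   ' Sex
        rOut = rOut + 1
    Next r
End Sub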
Hi all, I'm trying to get the following done: I have code which imports Excel files into my database with DoCmd.TransferSpreadsheet, which works great. But I'm now splitting the database, and I want the Excel files which a user selects in his front-end database to be imported into my back-end database. I don't see that possibility in TransferSpreadsheet, as it always targets the current database. My temporary solution is to import into the front end and use CopyObject to copy to the back end, but does anyone have a direct solution?
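One direct option (a sketch; the back-end path, sheet name and table name are placeholders) is to open the back end with DAO and run the import query there, using an Excel source in the SQL similar to the snippet further up, so nothing ever lands in the front end:

Code:
Sub ImportToBackend(strExcelFile As String)
    ' Run the make-table import directly in the back-end database.
    ' (Fails if tblImport already exists; use INSERT INTO ... SELECT to append.)
    Dim db As DAO.Database
    Set db = DBEngine.OpenDatabase("\\server\share\Backend.mdb")
    db.Execute "SELECT * INTO tblImport FROM " & _
        "[Excel 8.0;HDR=Yes;Database=" & strExcelFile & "].[Sheet1$]", dbFailOnError
    db.Close
End Sub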
I have a secure MS Access database, where users by default can only read data, but not write. How can I prevent them from importing or linking files, or inserting objects? I don't want to use any passwords in my database, since it's used inside the application. Thank you.
OK, I feel like an idiot: I did read the manual, Googled, and checked Microsoft help, but still cannot do it. The problem is that in the import feature there is no option to choose an Excel file.
I lowered the macro security level to take it out of "sandbox" mode, reinstalled Office and selected Run All Features again, and updated as well.
I tried blank databases to import into. No luck. When I go to External Data > Import, I can choose only ODBC, XML, SharePoint or Access files.
I am using MS Office Pro 2003. Thanks for the help.