Linq To SQL Varbinary For Large Files
Jun 11, 2008
Hello,
I have decided to use Linq for my current ASP.NET project and so far it has been good, but now I am implementing a system that will allow users to upload binary content such as pictures and videos. For ease of management and security, I have decided to store this content directly in the database. The performance hit is a minor concern because very few user-uploaded images/videos will be seen on any given page (usually just one).
From the limited tutorials I have seen on the internet, Linq supports the SQL Server varbinary column through its System.Data.Linq.Binary class. This class does not appear to support streams and instead loads the entire contents into memory. The contents can then be converted to an array of bytes, which can be written to the browser via the response stream. This is not good. What if I am sending a very large video? Varbinary supports up to 2 GB, and I can't have a 2 GB video sitting in memory. It makes a lot more sense to stream it via a small buffer.
Obviously, I am going to limit the size of the content that users can upload, but the core problem remains. If I limit content size to 2 MB and I have 2 GB of memory on the server, then I can only serve 1000 users concurrently. In reality, that number would be much less because of other processes running on the server.
Is there no way to stream data from a varbinary column with Linq using a small buffer of bytes?
Do I need to implement some custom logic on my Linq classes? Since these classes are automatically generated, how would I do such a thing?
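For comparison, plain ADO.NET can already do the kind of buffered streaming I'm after. A minimal sketch, assuming a hypothetical Media table with an Id key and a Content varbinary(max) column:
using System.Data;
using System.Data.SqlClient;
using System.IO;
// Streams one varbinary(max) value to an output stream in 8 KB chunks.
// CommandBehavior.SequentialAccess keeps ADO.NET from buffering the whole row.
static void StreamBlob(string connectionString, int mediaId, Stream output)
{
    using (SqlConnection conn = new SqlConnection(connectionString))
    using (SqlCommand cmd = new SqlCommand("SELECT Content FROM Media WHERE Id = @id", conn))
    {
        cmd.Parameters.AddWithValue("@id", mediaId);
        conn.Open();
        using (SqlDataReader reader = cmd.ExecuteReader(CommandBehavior.SequentialAccess))
        {
            if (!reader.Read()) return;
            byte[] buffer = new byte[8192];
            long offset = 0;
            long read;
            // GetBytes copies the next chunk into the buffer; loop until the column is exhausted.
            while ((read = reader.GetBytes(0, offset, buffer, 0, buffer.Length)) > 0)
            {
                output.Write(buffer, 0, (int)read);
                offset += read;
            }
        }
    }
}
The question is whether Linq can be coaxed into something equivalent, or whether I should drop down to ADO.NET for just this column.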
Thanks.
View 1 Replies
Jun 9, 2015
I have a table with raw scientific test results in a single field, some of which exceed 25 MB. I need to parse the field to find and aggregate selected values.
Table structure is
CREATE TABLE [dbo].[Gxxx_Data](
[id] [uniqueidentifier] NOT NULL,
[Status] [nvarchar](50) NULL,
[GxxxItem_ID] [int] NULL,
[Stats_Data] [varbinary](max) NULL,
) ON [PRIMARY] TEXTIMAGE_ON [PRIMARY]
[code]...
From this I need to parse and summarize the (Assembler) opcodes (MOV, CMPi, SHR, etc.). I need to parse the large [Stats_Data] field to locate the target data. The internal result strings are delimited with Char(10); conservative counts run from 64k to over 100k lines per record. Is there a way to parse the individual lines into another (temp) table that could be queried/regexed?
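One workable approach, as a sketch: pull one record's Stats_Data as text, split it on Char(10), and bulk-load the lines into a staging table that can then be queried set-based. The dbo.Gxxx_Lines staging table is an assumption for illustration.
using System;
using System.Data;
using System.Data.SqlClient;
// Explodes one record's Stats_Data into one row per line in dbo.Gxxx_Lines(id, Line).
static void ExplodeLines(string connectionString, Guid id)
{
    using (SqlConnection conn = new SqlConnection(connectionString))
    {
        conn.Open();
        string text;
        using (SqlCommand cmd = new SqlCommand(
            "SELECT CAST(Stats_Data AS varchar(max)) FROM dbo.Gxxx_Data WHERE id = @id", conn))
        {
            cmd.Parameters.AddWithValue("@id", id);
            text = (string)cmd.ExecuteScalar();
        }
        DataTable lines = new DataTable();
        lines.Columns.Add("id", typeof(Guid));
        lines.Columns.Add("Line", typeof(string));
        foreach (string line in text.Split('\n'))   // Char(10) delimiter
            lines.Rows.Add(id, line);
        using (SqlBulkCopy bulk = new SqlBulkCopy(conn))
        {
            bulk.DestinationTableName = "dbo.Gxxx_Lines";
            bulk.WriteToServer(lines);
        }
    }
}
Once the lines are rows, finding and aggregating the opcodes becomes an ordinary WHERE/GROUP BY over the staging table.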
View 9 Replies
Oct 20, 2000
Hi,
I have inherited some databases with extremely large log files.
I tried to truncate the transaction log, but it did not work.
Can somebody please tell me how to truncate these log files?
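For reference, the classic two-step fix on SQL Server 7/2000 is to truncate the log and then shrink the physical file; a sketch, with "MyDb" and the log's logical file name (see sp_helpfile) as placeholders:
using System.Data.SqlClient;
static void ShrinkLog(string connectionString)
{
    using (SqlConnection conn = new SqlConnection(connectionString))
    {
        conn.Open();
        // TRUNCATE_ONLY discards the inactive log records; SHRINKFILE then
        // releases the space back to the OS (target size in MB).
        using (SqlCommand cmd = new SqlCommand(
            "BACKUP LOG MyDb WITH TRUNCATE_ONLY; DBCC SHRINKFILE (MyDb_log, 100)", conn))
        {
            cmd.ExecuteNonQuery();
        }
    }
}
The same two statements can of course be run directly from Query Analyzer.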
Thanks in advance.
Attaullah
View 2 Replies
Dec 5, 2000
I have some large flat files that I need to import into my SQL Server database. The file sizes range from 16 MB to 116 MB. I've tried to save the files to an Excel spreadsheet and then import them in that format, but that didn't work. Does anyone have any suggestions?
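One route that avoids the Excel detour (Excel caps out far below these sizes): BULK INSERT straight from the flat file. A sketch, with the table name, path, and delimiters as assumptions to adapt; note the path is resolved by the server, not the client:
using System.Data.SqlClient;
static void ImportFlatFile(string connectionString)
{
    using (SqlConnection conn = new SqlConnection(connectionString))
    {
        conn.Open();
        using (SqlCommand cmd = new SqlCommand(
            @"BULK INSERT dbo.TargetTable FROM 'C:\data\bigfile.txt' " +
            @"WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n')", conn))
        {
            cmd.CommandTimeout = 0;  // large loads can exceed the 30-second default
            cmd.ExecuteNonQuery();
        }
    }
}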
View 1 Replies
Apr 4, 2007
I have a few tables using SQL Server 2005 Express. Currently they hold roughly 30-40k records each. I have my log files set to restricted growth at 90 MB. While I'm not close to reaching that, I would like my tables to be able to scale up to possibly millions of records. Based on that, I figure the transaction log file will probably need a higher threshold (unrestricted growth). For those with experience: for tables that have millions of records, what average log file sizes could I expect?
Is it a bad idea to just shrink the log file every night during off-peak hours so that, regardless of the number of records I have, I'll always start the day with a minimal log file?
Do large log files have any effect on SQL performance?
View 3 Replies
Aug 16, 2006
We have SQL Server running on a Windows 2003 server, only because Backup Exec requires it. At the location C:\Program Files\Microsoft SQL Server\MSSQL\Data
there is this file: SuperVISorNet_log.LDF, which is 15 GB and is accessed daily. I apologize because I don't know what this is!
My question is: can this file be 'pruned' (for want of a better word) because it's taking up a lot of backup space.
View 17 Replies
Jan 25, 2008
I am trying to run a query that deletes duplicate records on a table with 24m records. The problem is that each time I run it the log file fills up and I get an error saying the log file is full. For this reason the query never finishes.
Is there any way to turn off logging when running a query?
I think it also has to do with the disk drive running out of space, as the log file grows to over 12 GB.
The database is already running in simple recovery mode.
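A pattern that keeps the log small in simple mode, sketched below: delete in batches, so the log can truncate at each checkpoint instead of covering one giant transaction. DELETE TOP needs SQL Server 2005+; the table and duplicate-ranking columns are placeholders:
using System.Data.SqlClient;
static void DeleteDuplicatesInBatches(string connectionString)
{
    // Rows with rn > 1 are duplicates by KeyCol; Id breaks ties.
    const string batchDelete = @"
        WITH dupes AS (
            SELECT ROW_NUMBER() OVER (PARTITION BY KeyCol ORDER BY Id) AS rn
            FROM dbo.BigTable)
        DELETE TOP (10000) FROM dupes WHERE rn > 1;";
    using (SqlConnection conn = new SqlConnection(connectionString))
    {
        conn.Open();
        int deleted;
        do
        {
            using (SqlCommand cmd = new SqlCommand(batchDelete, conn))
            {
                cmd.CommandTimeout = 0;
                deleted = cmd.ExecuteNonQuery();  // stop when no duplicates remain
            }
        } while (deleted > 0);
    }
}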
View 11 Replies
Apr 9, 2008
How do I insert a large chunk of text into a table column? My project is to build a news website where people can go and read news articles. The articles are provided by the author in Word format, so how do I insert a news article into the table's column? Any help would be appreciated.
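A sketch of the usual approach: save the Word document as text/HTML first, then insert it through a parameter. The dbo.Articles table and its columns are invented for the example; an nvarchar(max) column takes articles of any realistic length:
using System.Data;
using System.Data.SqlClient;
using System.IO;
static void InsertArticle(string connectionString, string path, string title)
{
    string body = File.ReadAllText(path);
    using (SqlConnection conn = new SqlConnection(connectionString))
    using (SqlCommand cmd = new SqlCommand(
        "INSERT INTO dbo.Articles (Title, Body) VALUES (@title, @body)", conn))
    {
        cmd.Parameters.AddWithValue("@title", title);
        cmd.Parameters.Add("@body", SqlDbType.NVarChar, -1).Value = body;  // -1 = nvarchar(max)
        conn.Open();
        cmd.ExecuteNonQuery();
    }
}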
thanks
View 2 Replies
Feb 9, 2006
I have a table that I'm inserting a file into and using the Image data type to store the binary object. Now the code below works fine for files around 1.5 MB, but anything larger and it's like the code won't even execute and I get a Page Not found error.
I'm in the process of running some traces to find out what's going on in the backend, but I'm assuming there's something amiss with my code. The Image data type should handle files that size with no problem but for some reason it isn't.
Does anyone see anything wrong?
Thanks
Dim iLength As Integer = CType(File1.PostedFile.InputStream.Length, Integer)
If iLength = 0 Then Exit Sub 'not a valid file
Dim sContentType As String = File1.PostedFile.ContentType
Dim sFileName As String, i As Integer
Dim bytContent As Byte()
ReDim bytContent(iLength) 'byte array, set to file size
'strip the path off the filename
i = InStrRev(File1.PostedFile.FileName.Trim, "\")
If i = 0 Then
sFileName = File1.PostedFile.FileName.Trim
Else
sFileName = Right(File1.PostedFile.FileName.Trim, Len(File1.PostedFile.FileName.Trim) - i)
End If
conn = New SqlConnection(eco)
conn.Open()
cmd = New SqlCommand("INSERT INTO ECO_Attachments (ECOID, FromType, DocName,OldRev,NewRev,NtLogin,DisplayName, FileName, FileSize, FileData, ContentType) VALUES (@ECOID, @FromType,@DocName,@OldRev,@NewRev,@NtLogin,@DisplayName, @FileName, @FileSize, @FileData, @ContentType) ")
cmd.Connection = conn
Try
File1.PostedFile.InputStream.Read(bytContent, 0, iLength)
With cmd
.Parameters.Add("@ECOID", SqlDbType.Int)
.Parameters.Add("@FromType", SqlDbType.NVarChar, 50)
.Parameters.Add("@DocName", SqlDbType.NVarChar, 250)
.Parameters.Add("@OldRev", SqlDbType.NVarChar, 50)
.Parameters.Add("@NewRev", SqlDbType.NVarChar, 50)
.Parameters.Add("@NTLogin", SqlDbType.NVarChar, 100)
.Parameters.Add("@DisplayName", SqlDbType.NVarChar, 200)
.Parameters.Add("@FileName", SqlDbType.NVarChar, 255)
.Parameters.Add("@FileSize", SqlDbType.Real)
.Parameters.Add("@FileData", SqlDbType.Image)
.Parameters.Add("@ContentType", SqlDbType.NVarChar, 50)
.Parameters("@ECOID").Value = ECOID
.Parameters("@FromType").Value = From
.Parameters("@DocName").Value = DocName
.Parameters("@OldRev").Value = OldRev
.Parameters("@NewRev").Value = NewRev
.Parameters("@NTLogin").Value = NTLogon
.Parameters("@DisplayName").Value = DisplayName
.Parameters("@FileName").Value = sFileName
.Parameters("@FileSize").Value = iLength
.Parameters("@FileData").Value = bytContent
.Parameters("@ContentType").Value = sContentType
.ExecuteNonQuery()
'.ExecuteScalar()
End With
Catch ex As Exception
Response.Write(ex)
'Handle your database error here
Finally
conn.Close() 'close the connection whether the insert succeeds or fails
End Try
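One thing worth ruling out, assuming IIS/ASP.NET defaults: requests larger than httpRuntime's maxRequestLength (4096 KB out of the box, sometimes set lower at the machine level) are rejected before the page ever executes and surface as a generic "page not found" style error rather than an exception from this code. Raising the limit in web.config is the usual fix; a sketch:
<configuration>
  <system.web>
    <!-- maxRequestLength is in KB (10240 = 10 MB); executionTimeout is in seconds -->
    <httpRuntime maxRequestLength="10240" executionTimeout="300" />
  </system.web>
</configuration>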
View 1 Replies
Dec 7, 2000
Here's my dilemma: I have a file that's 308 bytes wide by 5.7 million records. The record length is fixed, and the position and width of each field within the record are known. When I run DTS I receive this error from the MS DTS flat file provider: "error creating file mapping view: not enough storage is available to process this command." Then when I try to continue with the wizard, it will not allow me to separate the data into the format that I need. Is there any other way to import this file using DTS?
View 1 Replies
Sep 20, 2005
Hi, my data files sit in the default directories and I think they are causing my partition to run out of space. I mainly use one db that I created, but not the others (i.e. master, model, tempdb, etc). Yet I see their MDF and LDF files growing. What can I do to shrink them down, or perhaps move them off to a larger partition after shrinking?
View 6 Replies
Dec 19, 2007
Hi...
During my web search looking for a solution I ran across SQL CE 3.5 articles. My questions about SQL CE 3.5 are:
1) Can SQL CE 3.5 handle a 4-6 GB file
- Read
- Parse (SQL)
2) Can SQL CE 3.5 act as a standalone client through which a user can view a large (4-6 GB) text file?
- Will I need a .NET (small) client to read the large (4-6 GB) text file?
More info:
The text file will reside on the machine where the SQL CE 3.5 is installed. There is no pull to get the data.
Thank you (in advance)...
SQL CE 3.5
View 3 Replies
Oct 22, 2007
Hello,
I created a SSIS solution for reading data from dbase and storing them in SQL Server. In a ForEachDirectory-Loop up to one thousand dbase files are read and stored. The system where the packages are running has 16 GB RAM.
For the first few hundred dbase files everything goes fine, but then, the RAM seems not to suffice any more and a temp file is created (I changed the path in BufferTempStoragePath).
How can it be that there is a need to create temp files if there is so much RAM available?
Why is the RAM filled more and more during the SSIS package execution?
Is there anything I can do to release some of it? (it is running in a loop and there is no need to store all the data)
Could it be caused by dbase?? (I use Microsoft Jet 4.0 OLE DB Provider)
Another thing is that the temp file is not stored in the path I set in BufferTempStoragePath.
Sufficient permissions are set, but the temp file is still created in the user temp folder...
Any kind of help is very much appreciated!
Best Regards,
Stefan
View 5 Replies
Jul 28, 2014
I need to create a script that will import large XML files (500 MB - 7 GB) on a daily basis and store the data in a relational db structure.
What is the best and fastest way of importing such files? I have played around with smaller files and found the following.
1. SSIS XML Data Source: It doesn't seem to like the complex elements types and throws out the file.
2. Using Bulk File Import, storing the file in an XML variable and using XQuery to parse the file: This works, but it can't take a file more than 2 GB in size, so I can't use this method.
3. C# + XML Serialization: This also works, but seems to be terribly slow. I open the DB connection once, so it doesn't open and close for each db call, but still seems like it takes a long time.
So: how do I import large XML quickly into a relational table structure?
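A sketch of a fourth option: stream the file with XmlReader (constant memory, no 2 GB ceiling) and push rows to SQL Server in batches with SqlBulkCopy. The "record" element name, its simple-content layout, and the dbo.XmlStaging table are assumptions to adapt to the real schema:
using System.Data;
using System.Data.SqlClient;
using System.Xml;
// Assumes records shaped like <record id="...">text</record>.
static void ImportLargeXml(string connectionString, string path)
{
    DataTable table = new DataTable();
    table.Columns.Add("id", typeof(string));
    table.Columns.Add("value", typeof(string));
    using (XmlReader reader = XmlReader.Create(path))
    using (SqlBulkCopy bulk = new SqlBulkCopy(connectionString))
    {
        bulk.DestinationTableName = "dbo.XmlStaging";
        bulk.BatchSize = 10000;
        while (reader.ReadToFollowing("record"))
        {
            table.Rows.Add(reader.GetAttribute("id"), reader.ReadElementContentAsString());
            if (table.Rows.Count == 10000)
            {
                bulk.WriteToServer(table);  // flush a batch, then reuse the buffer
                table.Clear();
            }
        }
        if (table.Rows.Count > 0) bulk.WriteToServer(table);
    }
}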
View 9 Replies
Aug 13, 2007
What is the easiest way to get a large fixed-width text file (200 columns) definition into SSIS? Having to define each column with the ruler would be very cumbersome.
View 5 Replies
Jul 16, 2007
Hi ,
Is there any method by which I can divide the large flat file into certain number of small files keeping the header in each of the sub files?
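One way, as a sketch: split the file outside SSIS with a small utility that repeats the header line at the top of each chunk. The chunk naming and size are placeholders:
using System.IO;
static void SplitFile(string path, int linesPerChunk)
{
    using (StreamReader reader = new StreamReader(path))
    {
        string header = reader.ReadLine();
        StreamWriter writer = null;
        string line;
        int chunk = 0, count = 0;
        while ((line = reader.ReadLine()) != null)
        {
            if (writer == null || count == linesPerChunk)
            {
                if (writer != null) writer.Close();
                chunk++;
                writer = new StreamWriter(path + ".part" + chunk);  // e.g. data.txt.part1
                writer.WriteLine(header);  // every chunk carries the header
                count = 0;
            }
            writer.WriteLine(line);
            count++;
        }
        if (writer != null) writer.Close();
    }
}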
Regards,
Prash
View 4 Replies
Aug 29, 2007
I have several databases that have grown to 300 GB and would like to distribute the data into multiple files across multiple drives. Can I create a new database that is spread across the new drives and use a full backup to restore or am I stuck with unloading the data table by table?
View 3 Replies
Sep 11, 2007
Hello,
I am attempting to restore the database from within VB.NET application I am making the following 3 calls:
RESTORE FILELISTONLY FROM DISK = 'C:\MyDatabase.dat'
USE Master RESTORE DATABASE MyDatabase FROM DISK = 'C:\MyDatabase.dat' WITH NORECOVERY, MOVE 'MyDatabase' TO 'C:\Program Files\Microsoft SQL Server\MSSQL\Data\MyDatabase.mdf', MOVE 'MyDatabase_log' TO 'C:\Program Files\Microsoft SQL Server\MSSQL\Data\LDF\MyDatabase.ldf', REPLACE
RESTORE DATABASE MyDatabase FROM DISK = 'C:\MyDatabase.dat'
using SMO. This logic works fine with small *.dat files; however, when using a *.dat file of about 4 GB I get an error on the 3rd RESTORE DATABASE call:
ExecuteNonQuery failed for Database 'master'.
An exception occurred while executing a Transact-SQL statement or batch.
Timeout expired. The timeout period elapsed prior to completion of the operation or the server is not responding.
Operator aborted backup or restore. See the error messages returned to the console for more details.
The same program/logic also works fine when I use MS SQL 2005, and it runs fine from the MS SQL 2005 Query Analyzer for both 2005 and 2000 databases. There seems to be a problem only with MS SQL 2000 from within VB.NET. Does anybody have any idea? I'd appreciate any response. Thanks
Eugene
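One knob that matches these symptoms, as a sketch: SMO statements run under the ServerConnection's StatementTimeout (600 seconds by default), and a multi-gigabyte restore can easily exceed it. Setting it to 0 disables the timeout (C# shown; the VB.NET equivalent is a single line as well):
Microsoft.SqlServer.Management.Smo.Server server =
    new Microsoft.SqlServer.Management.Smo.Server("MYSERVER");
server.ConnectionContext.StatementTimeout = 0;  // no timeout for long restores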
View 6 Replies
Jun 24, 2014
I have a master table containing details of over 800000 surveys made up of approximately 400 distinct document names and versions. Each document can have as few as 10 questions but as many as 150. Each question represents one row.
My challenge is to create a separate spreadsheet for each of the 400 distinct document names and versions containing all the rows and columns present in the master table. The largest number of rows would be around 150 and therefore each spreadsheet will not be very big.
E.g. in my sample data below, I will need to create individual Excel files named as follows...
"Document1Version1.xlsx" containing all the column names and 6 rows for the 6 questions relating to Document 1 version 1
"Document1Version2.xlsx" containing all the column names and 8 rows for the 8 questions relating to Document 1 version 2
"Document2Version1.xlsx" containing all the column names and 4 rows for the 4 questions relating to Document 2 version 1
I assume that one of the first things is to create a lookup of the distinct document names and versions assign some variables and then use this lookup to loop through and sequentially filter the master table data ready for creating the individual Excel files.
--CREATE TEMP TABLE FOR EXAMPLE
IF OBJECT_ID('tempdb..#excelTest') IS NOT NULL DROP TABLE #excelTest
CREATE TABLE #excelTest (
[rowID] [nvarchar](10) NULL,
[docName] [nvarchar](50) NULL,
[Code] .....
--Output
rowID docName docVersion question blankField
1 document1 1 q1 NULL
2 document1 1 q2 NULL
3 document1 1 q3 NULL
4 document1 1 q4 NULL
5 document1 1 q5 NULL
6 document1 1 q6 NULL
[Code] .....
View 9 Replies
Dec 8, 2006
Hello everyone,
This is my first time posting here; I hope this question has not been asked before. I tried to search for it but came up with nothing.
Recreating the error:
I am using VS2005. I created a Pocket PC 2003 project.
I have downloaded the SQL Server Compact Edition and installed it. I get the System.Data.SqlServerCe.dll file from the installation directory.
I reference to that DLL using Add Reference in VS2005.
Build it. In the Bin folder, a long list of files suddenly appears.
System.data.dll
System.data.oracleClient.dll
system.web.dll
system.enterpriseservices.dll
system.enterpriseservices.wrapper.dll
system.transactions.dll
and the rest of your original files in Bin
Worst of all, every one of these files is deployed to the emulator, causing it to run out of memory and fail to deploy.
Something is not right here, I just cannot figure it out! If this happens, each mobile device can hold only one application. That's not the way it should be, right?
If you have solved this before, do help. I am at my wits end at the moment.
Thanking you in advance.
Sincerely,
Lasker
View 1 Replies
Jul 4, 2006
Hi,
We are processing 6,000,000 rows (a 2 GB file) from a flat file and loading them into database tables using OLE DB Destination components. In the data pipeline of an SSIS package we have 1 flat file source reader, 7 lookup components (full cache mode), 1 multicast component, and 2 OLE DB destinations with the fast load option.
We have observed that the first 1,000,000 rows are processed and loaded into the target tables in just 4 minutes. The second set of 1,000,000 rows is processed in 15 minutes. After this, SSIS takes approximately 8-10 minutes to process each 100,000 rows. We are not able to identify the reasons for this unexpected behaviour of SSIS.
We thought that, as the input file size is 2 GB, SSIS was not able to manage it and was slowing down over the course of execution. We split the big input file into 60 small files of about 37 MB each. Then we modified the package by adding a For-Each Loop task to process all 60 small files and load them into the database server sequentially. Even with this approach, data loading slowed down drastically after processing 13 files.
To verify whether there was a problem with reading the source file or with the transformations, we replaced the OLE DB destination components with Flat File destinations. With Flat File destinations the processing time is very constant: every 8 minutes the package processes 1,000,000 rows and writes them to the destination files. So there is no problem with either the lookup components or the flat file source reader.
We are sure the target database server is in the same state/condition from the start to the end of package execution. The client box running the package has 1 GB RAM. During package execution the CPU usage is at 30% and PF usage is 580 MB. SP1 is installed on both client and server.
Does anyone have a clue what is causing the data load to slow down over the course of package execution?
View 3 Replies
Jul 26, 2006
I'm currently experiencing major problems with SSIS when opening and editing large .DTSX package files that contain Exec DTS 2000 Tasks which have the package data loaded internally. I have no issues if I point the task to a .DTS file, or to an actual DTS package on a SQL 2000 server - but if I load the package internally, then once the underlying .DTSX file gets over around 17 MB or so in size (which doesn't take long, making a few edits to even fairly simple packages now and then), I start to experience major issues with VS/BIDS 2005 crashing randomly when I try to perform any action with the package (open, save etc). Things like OutOfMemory exception errors, followed by the properties of the Exec DTS 2000 task being deleted, sometimes accompanied by messages about the application not being installed properly.
Again its ONLY when the underlying .DTSX file reaches a certain size limit, and only when I've got an Exec DTS 2000 task with the package loaded internally. I've replicated the issue using several different package files on several different machines (even on servers with lots of memory, fwiw).
Can anyone out there help me with this? SSIS - namely SSIS Exec DTS 2000 package tasks - are our lifeblood at my company and this trend of random and serious crashing on large package files is very disturbing to say the least.
thanks,
Wil
View 1 Replies
Jan 8, 2008
Hello!
I try to get a list of ConditionsVersion where Version is MAX for each ConditionsVersion.
I tried something like this (as seen on http://msdn2.microsoft.com/en-us/vcsharp/aa336747.aspx#maxGrouped):
List<ConditionsVersion> list = (from cv in ConditionsVersions
                                group cv by cv.FKConditions into cv
                                select new {
                                    PKConditions = cv.PKConditions,
                                    FKConditions = cv.FKConditions,
                                    MaxVersion = cv.GroupBy.Max(cv => cv.Version),
                                    CTimestamp = cv.Timestamp
                                }).ToList();
But it doesn't work. It would be great if someone knows why.
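For comparison, a sketch of the usual shape: after group ... into g, only the grouping g is in scope, so cv.PKConditions can no longer be referenced directly; the grouped key comes from g.Key and aggregates from g.Max(...):
var list = (from cv in ConditionsVersions
            group cv by cv.FKConditions into g
            select new
            {
                FKConditions = g.Key,
                MaxVersion = g.Max(x => x.Version)
            }).ToList();
Pulling PKConditions and Timestamp from the row that holds the max version needs a second step, e.g. g.OrderByDescending(x => x.Version).First().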
Thank you!
View 2 Replies
Nov 2, 2007
Well, just played a little bit with that new thing from Microsoft. Genius! Microsoft presented that step backwards as a step forward.
Say good bye to the 3-tier architecture, now any programmer, after 1 week training, will be able to put SELECT * into the source code. No more stored procedures and logic on a server. No more ugly WHERE clauses. Just SELECT * and pass all records in a loop :)
When I looked at the queries generated by LINQ in SQL Profiler, I noticed that they are generated automatically using the same pattern. It is obvious, of course, but now it will be really difficult to trace a problematic query back to the C# code. All updates to table X will look like identical twins!
On the other side, it is not so bad. We will soon have a lot of projects failing when they go to production and face real volumes of data. And a long queue of companies crying and asking us to save them. Perfect "job security". Please, use LINQ! Port all your code to LINQ immediately (laughing demonically like Dr. Evil).
Hm... on second thought, what could we suggest to these companies having performance problems with third-party LINQ applications when there is no source code? Today we can at least modify some stored procedures; with LINQ it looks like the only recommendation could be "contact the developer of that application or buy a more powerful server".
View 18 Replies
Apr 20, 2008
Is it possible to use LINQ to SQL with Report Services? If so, are there any examples, tutorials, etc.?
View 1 Replies
May 28, 2008
Is Linq a feature of SQL Server 2008?
Or is it a feature of .NET / Visual Studio 2008?
View 1 Replies
May 28, 2008
Some say LINQ to SQL leads to the death of stored procedures. Is that correct?
View 4 Replies
Nov 8, 2006
Will Linq be compatible with Sql Server Everywhere without having to add additional plugins?
View 7 Replies
Dec 30, 2007
Hello, I am using ASP.NET 3.5. I want to create a user control, usable from any project, that uploads a file into SQL Server: the user just gives the database connection string, table name, and column names depending on their need. I developed the code using LINQ.
I wrote a class in a simple format like the one below,
using System;
using System.Data;
using System.Configuration;
using System.Linq;
using System.Web;
using System.Web.Security;
using System.Web.UI;
using System.Web.UI.HtmlControls;
using System.Web.UI.WebControls;
using System.Web.UI.WebControls.WebParts;
using System.Xml.Linq;
using System.Data.Linq;
/// <summary>
/// Summary description for FILE_MASTER_INSERT
/// </summary>
//namespace Linq2Sql_demo_doc
//{
[System.Data.Linq.Mapping.Table(Name = "FILE_MASTER")]
public partial class FILE_MASTER
{
string Tablename;
public FILE_MASTER()
{
//
// TODO: Add constructor logic here
//
}
[System.Data.Linq.Mapping.Column(Name="filename")]
public string FileName
{
get;
set;
}
[System.Data.Linq.Mapping.Column(Name = "file_content")]
public byte[] file_content
{
get;
set;
}
[System.Data.Linq.Mapping.Column(Name = "file_id", IsPrimaryKey = true, IsDbGenerated = true, CanBeNull = false)]
public int file_id
{
get;
set;
}
}
public class TestDB : DataContext
{
public Table<FILE_MASTER> FILE_MASTERs;
//Initializing base class constructor
public TestDB(string s) : base(s) { }
}
This works, but it only suits a single table. I want the table name and column names in the highlighted mapping code above to change automatically depending on the user's input.
Is there any way to update the table name and column names from another class?
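One way this could work, sketched under the assumption that LINQ to SQL's external XML mapping is acceptable here: build the mapping at runtime from the user's table and column names and hand it to a plain DataContext (when an XmlMappingSource is supplied, it, rather than the class attributes, defines the mapping):
string xml = string.Format(@"
<Database Name=""TestDB"" xmlns=""http://schemas.microsoft.com/linqtosql/mapping/2007"">
  <Table Name=""{0}"" Member=""FILE_MASTERs"">
    <Type Name=""FILE_MASTER"">
      <Column Name=""{1}"" Member=""FileName"" />
      <Column Name=""{2}"" Member=""file_content"" />
      <Column Name=""{3}"" Member=""file_id"" IsPrimaryKey=""true"" IsDbGenerated=""true"" />
    </Type>
  </Table>
</Database>", tableName, fileNameColumn, contentColumn, idColumn);
System.Data.Linq.Mapping.XmlMappingSource mapping =
    System.Data.Linq.Mapping.XmlMappingSource.FromXml(xml);
System.Data.Linq.DataContext db = new System.Data.Linq.DataContext(connectionString, mapping);
System.Data.Linq.Table<FILE_MASTER> files = db.GetTable<FILE_MASTER>();
Here tableName, fileNameColumn, contentColumn, and idColumn are the user-supplied names.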
Thank you. Jeyaseelan
View 1 Replies
Jan 21, 2008
I'm new to ASP/VS/Linq and I'm having a small problem.
I have one table set up in SQL Server Express 2005 through Visual Studio 2008. The table name is "Users" and it has three columns (accountID, userName, email). AccountID is the primary key and set to auto-increment. I've added a couple of records by hand and it works.
I have a single form with a button, a label, and two text boxes. The button code is below. After entering some fake data that does not already exist in the database and clicking the button I get this.
Cannot insert explicit value for identity column in table 'Users' when IDENTITY_INSERT is set to OFF.
I understand that it is trying to insert something into the accountID field but I don't understand why since I'm only providing a username and e-mail address to insert.
Your help is greatly appreciated.
protected void Button1_Click(object sender, EventArgs e)
{
MyDatabaseDataContext db = new MyDatabaseDataContext();
var query = from u in db.Users
where u.email == txtEmail.Text
select u;
var count = query.Count();
if (count == 0)
{
//Create a new user object.
User newUser = new User();
newUser.username = txtUsername.Text;
newUser.email = txtEmail.Text;
//Add the user to the User table.
db.Users.InsertOnSubmit(newUser);
db.SubmitChanges();
}
else
{
Label1.Text = txtEmail.Text + " already exists in the database.";
}
}
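A likely cause, as a sketch: if the generated mapping doesn't mark accountID as database-generated, LINQ to SQL includes it (as 0) in the INSERT and SQL Server rejects it. Setting "Auto Generated Value" to true on the column in the .dbml designer produces a mapping along these lines on the generated property:
[Column(AutoSync = AutoSync.OnInsert, DbType = "Int NOT NULL IDENTITY",
        IsPrimaryKey = true, IsDbGenerated = true)]
public int accountID { get; set; }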
View 6 Replies
Jan 27, 2008
I have a supplier table with all my suppliers in it. I list them in a gridview. In this gridview, there is a link next to each record to an edit page. Below is the code for the edit page.
Protected Sub Page_Load(ByVal sender As Object, ByVal e As System.EventArgs) Handles Me.Load
Dim db As New orderLinqDataContext
Dim q = db.selectSupplierByID(Request.QueryString("id"))
Dim r As New System.Data.DataTable
txtFriend.Text = q(0).friendlyName
txtAdd1.Text = q(0).address1
txtAdd2.Text = q(0).address2
txtAdd3.Text = q(0).address3
txtAdd4.Text = q(0).address4
txtName.Text = q(0).supplierName
txtPhone.Text = q(0).phone
txtFax.Text = q(0).fax
txtPostCode.Text = q(0).postCode
End Sub
This causes a runtime error. The error is: "The query results cannot be enumerated more than once." and the txtAdd1.text line is highlighted.
How am I supposed to get at the data so I can fill my text boxes this way?
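A sketch of the usual fix (C# shown; the VB.NET version is analogous): a stored-procedure call on a DataContext returns a result that can only be enumerated once, and every q(0) starts a fresh enumeration. Materialise the result into a single object first and read each field from that copy:
var supplier = db.selectSupplierByID(id).Single();  // or .ToList() if several rows
txtFriend.Text = supplier.friendlyName;             // read every field from the copy
txtAdd1.Text = supplier.address1;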
Thanks,
View 5 Replies
Mar 7, 2008
var query = from cloc in context.t_companylocs
            where (cloc.ref_company == companyid)
               && (cloc.street == attributes["Street"])
               && (cloc.postalcode == attributes["Postalcode"])
               && (cloc.country in countrylist)
               && (cloc.city in citylist)
            select cloc;
attributes is a Dictionary<string, string> object. countrylist and citylist are List<string> objects. Of course the syntax above doesn't work; it's basically what I'm trying to achieve :-) A quick and dirty solution would be to just drop the two where constraints containing the "where in" statement and handle that part in the following foreach() loop. Can anyone please explain how you would do it properly? Regards
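A sketch of the idiomatic translation: LINQ to SQL turns List<T>.Contains(column) into a SQL IN clause, which is exactly the "where in" being reached for here:
var query = from cloc in context.t_companylocs
            where cloc.ref_company == companyid
               && cloc.street == attributes["Street"]
               && cloc.postalcode == attributes["Postalcode"]
               && countrylist.Contains(cloc.country)
               && citylist.Contains(cloc.city)
            select cloc;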
View 1 Replies
Mar 12, 2008
Does anyone have a good example of how to insert data using System.Data.Linq? All the examples I've seen do something like
NorthwindDataContext db = new NorthwindDataContext();
var x = new Product() {...};
db.Products.Add(x);
db.SubmitChanges();
However I'm not seeing an Add method on System.Data.Linq.Table<T>. Has this changed? Could I somehow not be generating my model correctly?
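For reference, a sketch of the renamed RTM methods: the beta-era Add/Remove on Table<T> became InsertOnSubmit/DeleteOnSubmit, so the same example now reads:
NorthwindDataContext db = new NorthwindDataContext();
Product x = new Product { ProductName = "Example" };  // property assumed from Northwind
db.Products.InsertOnSubmit(x);
db.SubmitChanges();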
View 2 Replies