Dear All,
I am currently working with a Progress database and am facing a problem while extracting data from it.
When I reload my extraction application, on some tables it pops up an "ODBC read failed" error.
When I click OK it reloads, but some data gets missed, i.e. QlikView does not pull all the data from the database. I have figured
out that the error is caused by certain fields in those tables: when I comment out these fields, QlikView
pulls all the data. I looked into a dump of one table provided by the DB team and found that the culprit field
contains data that exceeds the column size. But I cannot comment out that field, as it is required in the reports.
Does anybody have an idea how to tackle this situation?
Thanks in advance.
Regards
Lalit
The field length could also be checked (to see whether it is correct) by loading your records via Excel ODBC: it will fail in the same way.
Please read
http://documentation.progress.com/output/OpenEdge102b/pdfs/dmsdv/dmsdv.pdf
to see how to use dbtool (we used option 2), running it manually on an as-needed basis. I noticed this problem because a rebuild did not show all the records I expected (the record extraction count stayed the same...).
Using DBTool will solve your problem! On our 5 GB Progress database, it took about 5 minutes to complete.
Other helpful links are:
http://stackoverflow.com/questions/6193275/progress-odbc-problem-with-sql-width-dbtool
http://www.sonicsoftware.com/products/documentation/docs/mq_install.pdf
My own humble question: as one of the links suggests, a user recommends running this tool once a month. Does anyone know how to use it as a command-line utility so that it could be scheduled? If so, please share the code.
Lalit
Sorry, I got confused between the Progress ODBC and Postgres ODBC drivers.
We did work with Progress as well a while ago.
Mike
It's OK, Mike.
I am working on a Progress database.
Any suggestions on the same issue would be helpful.
Lalit
We have Progress OpenEdge installed on a server running Linux.
I have a program scheduled to run once a week with this code:
UNIX SILENT VALUE ('dbtool /users/colontw/db/XCOLONTW < /users/test/jeroeno/dbtoolinput.txt 1> /dev/null 2> /users/test/jeroeno/dbtoollog.txt').
dbtoolinput.txt is filled with the options you want to feed to dbtool, for example:
2
1
all
all
1
With 1> and 2> you can redirect the output and errors to a location other than your screen.
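To answer the earlier scheduling question: the same dbtool run can also be driven straight from a shell script and scheduled with cron, without going through an ABL program. This is only a sketch; the `DLC` install directory, the database path, and the file locations are assumptions you must adapt to your site, and the menu answers are simply the ones from the dbtoolinput.txt example above.

```shell
#!/bin/sh
# run_dbtool.sh - sketch of an unattended dbtool run (paths are hypothetical)
DLC=/usr/dlc                      # assumed OpenEdge install directory
DB=/users/colontw/db/XCOLONTW     # database path from the example above

# Write the answers to dbtool's interactive menu to a file
# (same options as in the dbtoolinput.txt example: 2, 1, all, all, 1).
printf '2\n1\nall\nall\n1\n' > /tmp/dbtoolinput.txt

# Uncomment on a machine where OpenEdge is installed:
# "$DLC/bin/dbtool" "$DB" < /tmp/dbtoolinput.txt \
#     > /tmp/dbtoollog.txt 2>&1
```

A crontab entry such as `0 3 * * 0 /users/test/jeroeno/run_dbtool.sh` would then run it every Sunday at 03:00.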
Has anyone solved this problem?
Is this problem also related to a JDBC connection? Maybe that could be a solution.
Some ODBC drivers are tricky to work with.
If there is a JDBC driver, you can try it instead together with Ralph's JDBC connector,
or you can try our ETL Tools - QlikView Connector; it works with ODBC as well.
Mike
Character fields in Progress are variable length and ignore the SQL width setting in the DB schema (there is a start-up parameter that forces Progress to adhere to it - I can't remember it off the top of my head).
One of the records in your address table (field address-1) has more characters in it than the SQL width setting for that field.
Two options:
Update the Progress database schema so the SQL width is greater than the maximum length of the data in the field, or reduce the text in the offending record.
Do a search on "openedge dbtool" - dbtool is a tool supplied by Progress that can update the SQL width settings for you.
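For the first option, the SQL width can also be raised directly through OpenEdge's SQL engine with sqlexp instead of dbtool. The sketch below is only illustrative: the table and column names, the broker port, and the credentials are placeholders, and you should verify the PRO_SQL_WIDTH syntax against your OpenEdge version's SQL reference before running it.

```shell
#!/bin/sh
# Sketch: widen a column's SQL width via OpenEdge SQL (all names are placeholders).
cat > /tmp/fix_width.sql <<'EOF'
-- Raise the SQL width well above the longest value actually stored.
ALTER TABLE PUB."address" ALTER COLUMN "address-1" SET PRO_SQL_WIDTH 200;
COMMIT;
EOF

# Uncomment with your real database, port and credentials:
# sqlexp -db XCOLONTW -S 2500 -user sysprogress -password xxxx < /tmp/fix_width.sql
```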
Lalit Chaudary
Same problem reading a Progress 10.2B Database:
QVX_UNEXPECTED_END_OF_DATA: SQL##f - SqlState: S1000, ErrorCode: 4294757284, ErrorMsg: [DataDirect][ODBC Progress OpenEdge Wire Protocol driver][OPENEDGE]Column num_factura in table PUB.Docs-x-pagar has value exceeding its max length or precision.
Which of all these posts is the correct answer?
Dear Emmanuelle,
In a Progress DB there is no hard limit on the length of the character data that can be inserted into a field. E.g. say there is a table A in Progress with the fields below:

Field Name      Field1  Field2  Field3  Field4
Length (CHAR)   100     50      50      29

So in the Progress DB you can enter data that is longer than the defined length; in such cases the mentioned error occurs. The way to tackle it is to increase the length (SQL width) of the fields on which the error occurs, so ask the DB team to increase the length of the offending field.
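Before asking the DB team, it can help to find out which rows actually exceed the defined width. A sketch of a length check through the SQL engine, assuming the table and column from the error message earlier in the thread and an illustrative width of 29 (the Field4 value above); only a count is selected, so the over-width value itself is never fetched by the driver:

```shell
#!/bin/sh
# Sketch: count rows whose value exceeds the declared width (names are placeholders).
cat > /tmp/check_width.sql <<'EOF'
SELECT COUNT(*) FROM PUB."Docs-x-pagar" WHERE LENGTH("num_factura") > 29;
EOF

# Uncomment with your real connection details:
# sqlexp -db XCOLONTW -S 2500 -user sysprogress -password xxxx < /tmp/check_width.sql
```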
Regards
Lalit