Try adding "no eof" to your format specification and see if that makes a difference. I've never tried to read that many (2 725 462) bytes into a single field, so you may have other problems as well. Could you process this as individual rows instead? If you are trying to tie multiple lines together, you can do that by assigning a logical record number to each line and then using Concat() to group them back together.
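A minimal sketch of that approach, assuming the file is line-delimited and that a new record starts whenever a line begins with 'REC' (the file name, field names, and the 'REC' marker are all assumptions for your data):

Lines:
LOAD
    RecNo() as LineNo,
    // new record on a 'REC' line, otherwise carry the previous RecordID forward;
    // RangeSum() treats the NULL that Peek() returns on the first row as 0
    If(Left(@1, 3) = 'REC', RangeSum(Peek('RecordID'), 1), Peek('RecordID')) as RecordID,
    @1 as LineText
FROM [source.txt] (txt, no labels, delimiter is '\t');  // pick a delimiter that never occurs in the data, so each line loads as one field

Records:
LOAD
    RecordID,
    Concat(LineText, ' ', LineNo) as FullText  // third argument keeps the original line order
RESIDENT Lines
GROUP BY RecordID;

DROP TABLE Lines;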
I think Rob is right: you shouldn't load everything into a single field value. Instead, load the rows individually, perhaps with an added RecNo() or RowNo(), and maybe a further counter computed with Peek() or Previous() in a following ordered resident load. If the data structure is consistent (or could be made consistent), you could use something like:
mod(rowno(), 12) as FieldID,
ceil(rowno() / 12) as RecordID
which I used for a logfile that had a stream format in which every 12 rows belonged to one record. If this isn't helpful in any way, please elaborate on why you want to keep the whole content within a single field value.
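For such a fixed 12-rows-per-record layout, the full load might look like this (file and field names are assumptions; note that Mod(RowNo(), 12) yields 0 instead of 12 on every 12th row, which still works as a cycle marker):

Structured:
LOAD
    Mod(RowNo(), 12) as FieldID,     // position of the line within its record
    Ceil(RowNo() / 12) as RecordID,  // groups of 12 consecutive rows share one RecordID
    @1 as LineText
FROM [logfile.txt] (txt, no labels, delimiter is '\t');  // delimiter chosen so each line loads as one field

A Generic LOAD on RecordID, FieldID and LineText resident from that table could then pivot each record into one row per RecordID, if that is the shape you need.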