Here's a quick & dirty idea to resolve your problem.
Read that file in via the low-level file handling functions (FOPEN(), FGETS(), FPUTS()). When you get to the field that's too long, chop it into two parts with another pipe between them, and spit it all out to a new text file. Now you can simply import that new text file; a rough sketch is below.
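Something along these lines, for example. Untested, and it assumes the overlong field is the fifth one, that it never runs past 508 characters (so one extra pipe is enough), and that no line tops FGETS()'s 8192-byte ceiling; source.txt and fixed.txt are just placeholder names.

* Untested sketch: copy source.txt to fixed.txt, splitting an overlong
* fifth field into two pieces of at most 254 characters each.
LOCAL lnIn, lnOut, lcLine, lnP1, lnP2, lcField

lnIn  = FOPEN("source.txt")          && read-only by default
lnOut = FCREATE("fixed.txt")
IF lnIn < 0 OR lnOut < 0
   ? "Could not open the files"
   RETURN
ENDIF

DO WHILE !FEOF(lnIn)
   lcLine = FGETS(lnIn, 8192)        && FGETS() stops at 254 bytes unless told otherwise
   lnP1 = AT("|", lcLine, 4)         && pipe just before field 5
   lnP2 = AT("|", lcLine, 5)         && pipe just after field 5 (0 if it's the last field)
   IF lnP2 = 0
      lnP2 = LEN(lcLine) + 1
   ENDIF
   lcField = SUBSTR(lcLine, lnP1 + 1, lnP2 - lnP1 - 1)
   IF LEN(lcField) > 254
      * chop it into two parts with another pipe between them
      lcField = LEFT(lcField, 254) + "|" + SUBSTR(lcField, 255)
   ENDIF
   FPUTS(lnOut, LEFT(lcLine, lnP1) + lcField + SUBSTR(lcLine, lnP2))
ENDDO

FCLOSE(lnIn)
FCLOSE(lnOut)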
-K-
On 4/20/2017 10:58 AM, Matt Wiedeman wrote:
Hello everyone,
I need to set up a job to import a pipe-delimited text file. This is easy enough, but one of the fields is longer than 254 characters. If I use a memo field, that field does not import. I started to set up a routine to step through each character and store the fields manually, but I would rather not do it that way.
Does anyone have a function or tip they can share to resolve this situation?
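Following up on the suggestion at the top: once the rewritten file exists, the import itself could go something like this. Since the delimited append won't fill the memo field, the two halves land in a throwaway cursor first and get glued into the memo afterwards. Untested, and every cursor, table, and field name here is invented; adjust the widths and column list to your real layout.

* Untested; csvIn, real_table, f1..f4 and notes are made-up names.
* Assumes the rewritten file from the sketch above, with the long field split in two.
CREATE CURSOR csvIn ( ;
   f1    C(50), ;
   f2    C(50), ;
   f3    C(50), ;
   f4    C(50), ;
   long1 C(254), ;
   long2 C(254) )

APPEND FROM fixed.txt DELIMITED WITH CHARACTER "|"   && pipe as the separator

* Glue the two halves back into the memo field of the real table.
* RTRIM() drops the trailing-space padding on fixed-width character fields;
* if a real space can fall exactly at the 254-character split, handle that case separately.
SELECT csvIn
SCAN
   INSERT INTO real_table (f1, f2, f3, f4, notes) ;
      VALUES (csvIn.f1, csvIn.f2, csvIn.f3, csvIn.f4, ;
              RTRIM(csvIn.long1) + RTRIM(csvIn.long2))
ENDSCAN

-K-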