On Fri, Dec 29, 2017 at 12:20 PM, <mbsoftwaresolutions@mbsoftwaresolutions.com> wrote:
On 2017-12-29 12:00, Stephen Russell wrote:
I think you missed my point. This is a separate product you are creating that will pull data from VFP and store it only online. You can then build a new style of reporting outside of VFP. Once they approve it, you can delete the data older than 4 years or 7 years from the VFP environment. The Data Scientist role may suit you very well.
Check out this: https://www.zs.com/services/technology/technology-services/big-data-and-data-scientist-services.aspx
Thanks for the link. Data Scientist is definitely a job for the future, and big $$$ I think! My original post was not so much about reporting; it was about hauling tons of no-longer-relevant data over the LAN, and the archival process I proposed to resolve that (without permanently deleting data). As we said in a previous thread, the whole DBF isn't coming across the network unless you've got commands like REPLACE ALL, and unfortunately, in the case of my previous corporate app at Sylvan, there's tons of old code from previous devs who did that often.
My company won't get rid of transactional data, and the 4 GL tables that are in my way total roughly 600 gigs. Finding specific data in them that is outside the scope of my indexes takes way too long, and it also adds time to every restore. In 2018 we are porting to an upgrade on a new data server, so each test restore will take even longer.
As I see it, IRS retention rules demand the past 7 years of data. I'll keep 8, and remove the 3 years of older transactions now, because we already have them in the DW as well.
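The purge policy above can be sketched in a few lines. This is a minimal illustration, not anyone's actual code: the `transactions` table, the column names, and the cutoff date are all hypothetical, and it uses SQLite stand-ins for the live database and the data warehouse. The key safety step is deleting from the live tables only the rows the warehouse confirms it already holds.

```python
import sqlite3
from datetime import date

# Hypothetical schema: both the live DB and the warehouse (DW) hold a
# `transactions` table with (id, tran_date, amount). In-memory SQLite
# stands in for both systems here.
live = sqlite3.connect(":memory:")
dw = sqlite3.connect(":memory:")
for db in (live, dw):
    db.execute(
        "CREATE TABLE transactions (id INTEGER PRIMARY KEY, tran_date TEXT, amount REAL)"
    )

rows = [
    (1, "2001-06-01", 10.0),
    (2, "2005-03-15", 20.0),
    (3, "2015-01-10", 30.0),
]
live.executemany("INSERT INTO transactions VALUES (?, ?, ?)", rows)
# The DW copy already contains everything, per the policy above.
dw.executemany("INSERT INTO transactions VALUES (?, ?, ?)", rows)

# Keep 8 years of history relative to a fixed as-of date (hypothetical
# here; in practice this would be the run date of the archive job).
as_of = date(2017, 12, 29)
cutoff = date(as_of.year - 8, 1, 1).isoformat()  # "2009-01-01"

# Candidates: live rows older than the cutoff.
old_ids = [
    r[0]
    for r in live.execute(
        "SELECT id FROM transactions WHERE tran_date < ?", (cutoff,)
    )
]
# Safety check: only delete rows the warehouse confirms it holds.
dw_ids = {
    r[0]
    for r in dw.execute(
        "SELECT id FROM transactions WHERE tran_date < ?", (cutoff,)
    )
}
to_delete = [(i,) for i in old_ids if i in dw_ids]
live.executemany("DELETE FROM transactions WHERE id = ?", to_delete)
live.commit()
```

After the run, the 2001 and 2005 rows are gone from the live database but still present in the warehouse, while the 2015 row stays in both. Nothing is permanently lost; the old transactions just stop traveling over the LAN with every query and restore.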
Welcome to my monthly fight. :)