I've seen long-lived applications with VFP backends that had accumulated a ton of data (10+ years' worth), and in one recent case I devised a method to "archive" old data by moving it intelligently into a subfolder, so it could be easily retrieved and/or re-imported into the main data set. I haven't used a VFP backend since 2004, when Bob Lee introduced me to the MySQL world, but nonetheless I thought I'd ask whether devs here ever build an "archiving" feature into their software, and how they do it. In my case, instead of slinging 600MB of data across the network (for one of my clients), archiving cut the volume by roughly 75%, so only about 25% of it was being pulled across the LAN. (They didn't need all the data from the beginning of the app's life; they just needed relevant/recent data.)
I realize that with MySQL and other such RDBMSes this is a non-issue, but I wanted to ask the VFP-backend folks how they approach it, for the sake of (hopefully) interesting discussion. One final juicy thread before 2017 is finished. :-)
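Roughly, the mechanics look like this (a stripped-down sketch only; the table name, date field, cutoff, and folder here are placeholders, not my client's actual schema):

* Archive rows older than a cutoff into an "archive" subfolder (sketch only).
LOCAL lcArchiveDir, ldCutoff
ldCutoff = GOMONTH(DATE(), -36)        && keep roughly the last 3 years live

USE invoices IN 0 EXCLUSIVE ALIAS invoices
SELECT invoices

lcArchiveDir = ADDBS(JUSTPATH(DBF())) + "archive\"
IF !DIRECTORY(lcArchiveDir)
    MD (lcArchiveDir)
ENDIF

* Copy the old rows out to a free table in the archive folder...
COPY TO (lcArchiveDir + "invoices_pre" + TRANSFORM(YEAR(ldCutoff))) ;
    FOR inv_date < ldCutoff

* ...then remove them from the live table.
DELETE FOR inv_date < ldCutoff
PACK
USE IN invoices

Bringing a chunk of it back is just an APPEND FROM against the archived DBF.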
Hi,
I'd advise you to implement Craig Boyd's class, which you can find at http://www.sweetpotatosoftware.com/blog/index.php/2008/02/22/vfp-database-ba...
Rgds,
Koen
Hi,
That's nice for backups, Koen, but that's not what this post was about. Still, thanks for the link!
Cheers, --Mike
Why not create a data warehouse for that data and archive it that way? You can put it into MySQL and remove it from the .dbfs at the same time.
Create your fact and dimension tables to hold the data the DW truly needs over the long haul. Then you can investigate a variety of tools to enable data analytics going forward. This might give you some ideas: https://blog.capterra.com/free-and-open-source-data-visualization-tools/
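From the VFP side, something along these lines would get the old rows across (just a sketch; the ODBC connection string, warehouse table, and field names are made up for illustration):

* Push rows older than the cutoff from a DBF into a MySQL warehouse table,
* one at a time via ODBC. All names here are illustrative.
LOCAL lnHandle, ldCutoff
ldCutoff = GOMONTH(DATE(), -48)

lnHandle = SQLSTRINGCONNECT("Driver={MySQL ODBC 5.3 ANSI Driver};" + ;
    "Server=dwserver;Database=warehouse;Uid=etl_user;Pwd=secret;")
IF lnHandle < 1
    RETURN        && connection failed
ENDIF

USE invoices IN 0 SHARED ALIAS invoices
SELECT invoices
SCAN FOR inv_date < ldCutoff
    SCATTER MEMVAR
    SQLEXEC(lnHandle, ;
        "INSERT INTO fact_invoice (inv_no, inv_date, cust_id, amount) " + ;
        "VALUES (?m.inv_no, ?m.inv_date, ?m.cust_id, ?m.amount)")
ENDSCAN
SQLDISCONNECT(lnHandle)
USE IN invoices

* Once the warehouse copy is verified, those rows can come out of the DBF.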
Hi Steve,
Again, it's not an issue with non-VFP backends like MySQL. This was about the VFP-backend folks. If I were to redesign an app, I would use MariaDB as the backend if I could.
HNY! --Mike
I think you missed my point. This is a separate product you would create that pulls the data from VFP and stores it online only. You can then build up a new style of reporting outside of VFP. Once they approve of it, you can delete the data older than 4 or 7 years from the VFP environment. The Data Scientist role may suit you very well.
Check out this: https://www.zs.com/services/technology/technology-services/big-data-and-data...
Thanks for the link. Data Scientist is definitely a job for the future, and big $$$, I think! My original post was not so much about reporting; it was about hauling tons of no-longer-relevant data over the LAN, and the archival process I proposed to resolve that (without permanently deleting anything). As we said in a previous thread, the whole DBF doesn't come across the network unless you've got commands like REPLACE ALL, and unfortunately, in the case of my previous corporate app at Sylvan, there's tons of old code from previous devs who did that often.
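To illustrate the difference (table and field names are just examples, and this assumes an index tag on the date field):

USE orders IN 0 SHARED ALIAS orders
SELECT orders

* Visits every record, so effectively the whole DBF crosses the LAN:
REPLACE ALL laststatus WITH "CLOSED"

* A filter on an indexed field lets Rushmore pull only the index blocks
* plus the matching records:
SELECT * FROM orders WHERE orddate >= GOMONTH(DATE(), -12) INTO CURSOR curRecent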
My company won't get rid of transactional data, and the four GL tables that are in my way total roughly 600 gigs. Finding specific data in them outside the scope of my indexes takes way too long, and it adds time to every restore. In 2018 we are porting to an upgrade on a new data server, so each test restore will take even more time.
As I see it, the IRS demands the past 7 years of data. I'll keep 8 and remove the 3 oldest years of transactions now, since we already have them in the DW as well.
Welcome to my monthly fight. :)
Didn't you guys say you're moving things to "The Cloud"?
Yes, we may do that, or we may stay on-prem. Either way we are doing the upgrade. Right now it takes 1.35 hours to restore my database from a backup.
For the cloud option, I have no idea how long it would take our DR group to push a non-deduped database up to the cloud site to be installed in our TEST environment. Is that an 8-hour task, or closer to 20+ hours, just due to today's 1.2 TB file size?
On-prem, I can use a compressed backup file that takes only an hour to copy to the new test server and then restore locally in an hour and 20 minutes. We are a few versions behind the mark, and these ports are a massive time crunch when we have to run them again and again.