Medved Trader Forums

Observation and Question


timtrader


I've been using MT on a fast new computer for about 4 months now (Intel Core i7-4770). Over time, MT has gradually taken longer and longer to both start up and shut down. Last week it got to the point of taking about 4 full minutes to start up and a full minute or more to shut down. Task Manager showed the MT.exe process using about 1.4 GB of memory, my free memory would often drop below 100 MB, and response time would sometimes lag for a second or two during the trading day. (I currently only have 8 GB on this machine.)

So, I poked around and found that my SDATA.BIN file had grown to over 4.7 GB. Three days ago I shut down MT, renamed my SDATA.BIN file, and started MT back up. Ever since then, MT fully starts up in about 10 seconds (all data sources streaming) and shuts down instantly. My free memory now stays at about 2.5 GB, and the MT.exe process usually takes only about 250 MB of memory. The new SDATA.BIN file that MT created was only about 25 MB after the first day, but 3 days later it is sitting at about 134 MB, so based on my current usage, I suspect it will eventually become a huge file and slow things down again.

 

I periodically research over 100 stocks after hours and have my "Max # of Days of Intraday Data" set to 55. I know that's high, but it's handy for research, and this fast machine coupled with the fast quotes from IQFeed will return those 55 days of intraday data in about 2 seconds or less for a newly entered stock symbol with an average daily volume of about 50 million shares. I assume it's this historical data that is causing my SDATA.BIN file to grow so large over time. So, I have a couple of questions based on this information:

 

1.)  Based on my settings, will MT periodically delete the data older than 55 days in the SDATA.BIN file, or does it just keep all of the historical data and continue to grow the file?

 

2.)  Based on my settings and usage, would you recommend that I add more memory to my machine, or just plan to periodically shut down MT, delete the SDATA.BIN file, and let a new, smaller file be created? (I know historical data is lost this way, but that doesn't seem to matter much given the speed of this machine. It appears that the size of the SDATA.BIN file affects the memory usage of the MT.exe process and the amount of free memory on the machine.)

 

FYI, MT is a great product!  :D

 

Thanks,

Tim

 


Basically, MT has one large memory-mapped file for the data. It is divided into pages (chunks of memory). As it needs more room to store quotes, etc., it allocates more pages. If no free pages are available, the file grows. If you clear data, or old data is deleted, those pages are freed up to be used by future data. However, the file size will not shrink.

 

So if one day you decide to look through 1000 symbols with 55 days of data on each, the file will grow to hold all that data. If you then clear the data and from then on use only 10 symbols, the file will not grow any more, but it won't shrink either. That may be what happened.
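As a rough illustration of the behavior Jerry describes, here is a minimal sketch of a page-based allocator over a data file. This is not MT's actual implementation; the class, names, and page size are all hypothetical, purely to show why clearing data frees pages for reuse without shrinking the file.

```python
# Hypothetical sketch of page-based allocation over a data file.
# Not MT's actual code; PagedFile, PAGE_SIZE, etc. are illustrative.

PAGE_SIZE = 4096

class PagedFile:
    def __init__(self):
        self.total_pages = 0   # file size = total_pages * PAGE_SIZE
        self.free_pages = []   # pages released by cleared/deleted data

    def allocate_page(self):
        # Reuse a freed page if one exists; only grow the file otherwise.
        if self.free_pages:
            return self.free_pages.pop()
        page = self.total_pages
        self.total_pages += 1  # the file grows; it never shrinks on its own
        return page

    def release_page(self, page):
        # Clearing data frees pages for future use but keeps the file size.
        self.free_pages.append(page)

    def file_size(self):
        return self.total_pages * PAGE_SIZE
```

Under this model, a burst of activity (say, backfilling 1000 symbols) grows the file once, and later clearing only recycles those pages internally.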

 

We plan to add a function to consolidate the SDATA file when the free data pages become a significant portion of the file, but have not done it yet. Will do.

 

Jerry


  • 1 month later...

Hi,

 

Regarding data files.

 

I track a large number of stocks, and I found that the folder C:\Users\[username]\AppData\Local\2GK\Medved Trader\Temp\Streamers contains a large number of backfill files, including files for really old dates and for symbols that I no longer follow (they are no longer in any portfolios).

 

Will these files ever get deleted, or will they stay?

 

I assume it is safe to manually delete them, but maybe cleanup of the unnecessary backfill files could be part of the data consolidation procedures in MT?


Those are for troubleshooting and are only supposed to be created when running with the Log level set to 100 or higher. However, I found one spot in the Stockwatch configuration where MT was saving them even at log level 10 (the default). I fixed that; the next build will have the fix. You can safely remove them.
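Since the developer confirms these files are safe to remove, a small script can clear out the stale ones. This is a hypothetical helper, not part of MT: the folder path is the one quoted above, the age threshold is an arbitrary choice, and it should only be run while MT is shut down.

```python
# Hypothetical cleanup of leftover backfill files in the Streamers temp
# folder (path taken from the post above). Run only with MT shut down.
import os
import time
from pathlib import Path

STREAMERS = (Path(os.environ.get("LOCALAPPDATA", ""))
             / "2GK" / "Medved Trader" / "Temp" / "Streamers")
MAX_AGE_DAYS = 7  # illustrative threshold, pick what suits you

def clean_backfill_files(folder: Path, max_age_days: int) -> int:
    """Delete files older than max_age_days; return how many were removed."""
    if not folder.is_dir():
        return 0
    cutoff = time.time() - max_age_days * 86400
    removed = 0
    for f in folder.iterdir():
        if f.is_file() and f.stat().st_mtime < cutoff:
            f.unlink()
            removed += 1
    return removed

if __name__ == "__main__":
    print(f"Removed {clean_backfill_files(STREAMERS, MAX_AGE_DAYS)} file(s)")
```

(As noted in the next build announcement, the files should stop being created at the default log level, so this is only a one-time tidy-up.)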


I've been deleting the data file about once a week to keep the size down.  It was about 600 MB as I read your note.  I just added hundreds of symbols to a portfolio, cycled through each one to fill in the chart data, and deleted them from the portfolio.  My data file is now 1.25 GB.  Will that work for a consolidate test?

 

Thanks,

Tim


It may, but the way my code checks now, it will not do the consolidation for at least a couple of days. I will be changing the way the process is triggered. Instead of it being automatic (and slowing down the startup process without any user control), I am thinking of prompting the user on the next startup to ask if they want to consolidate.


A prompt would be good.  I sometimes research dozens or hundreds of stocks on a weekend and would want to reduce the data file size when I'm done.  I'll leave my data file alone so it will be of decent size in case you need it for testing.

 

Thanks,

Tim


FYI, I just tested the Consolidate option. My data file was 1.252 GB before and 1.183 GB after. The vast majority of that data was from 2 days ago, as I mentioned above. I can run it again in the coming days if you expect the size to be reduced further after more time has passed. The entire startup took just over 2 minutes, most of that spent doing the consolidation.

 

Thanks,

Tim


Basically, that means there wasn't much to consolidate. If you do something like clear data on half the stocks and then run it, you will notice a big drop. Or if you remove the symbols from your portfolios and restart MT, the data will clear a few days later, and consolidating after that will make a big difference.

 

The idea is that because consolidation is a costly (slow) operation, MT does not try to catch every possible reduction in size as it happens. Instead, there is code that handles cleanup with some logic to remove "unused" data: stuff that has not been accessed recently and is not in any current portfolio, old data, etc. After that runs, the consolidate check looks at whether a disproportionate share of the file is empty pages, and if so, it prompts you to consolidate.
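The trigger logic described above could be sketched as a simple ratio check. This is purely illustrative; the threshold value and function name are assumptions, not MT's actual settings.

```python
# Illustrative version of the consolidate trigger described above:
# prompt only when free pages are a large share of the file.
# The 0.5 threshold is an assumed value, not MT's actual setting.
FREE_RATIO_THRESHOLD = 0.5

def should_prompt_consolidate(total_pages: int, free_pages: int) -> bool:
    """True when empty pages make up a disproportionate share of the file."""
    if total_pages == 0:
        return False
    return free_pages / total_pages >= FREE_RATIO_THRESHOLD
```

This matches the behavior the user observed: right after a burst of new data, almost no pages are free, so consolidation barely shrinks the file; after the cleanup pass frees old pages, the ratio crosses the threshold and the prompt fires.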

