Author Topic: Too much data for my program to handle! (7 messages, Page 1 of 1)

brandon
-Interested User-
Posts: 19
Joined: Jan 4, 2006


Posted: Jan 13, 2006 09:22 PM          Msg. 1 of 7
When monitoring approximately 100 or so ACTIVE issues, I have quite a bottleneck: my program cannot read line-by-line at anywhere near the speed the data arrives, and the queue fills up considerably. I am reading one line at a time from the update server (port 5009) by read()ing from the socket until a CRLF is received; after receiving a line, it splits the line on commas, creates the struct, fills in the values, and passes it to each of my threads analyzing the data.
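[Editor's note: the read-until-CRLF loop described above can be sketched as a buffering helper. This is a generic sketch, not IQFeed's actual client code; the sample messages below are hypothetical, not IQFeed's real wire format.]

```python
def feed_chunk(buf: bytes, chunk: bytes):
    """Append a raw socket chunk to the carry-over buffer and split out
    complete CRLF-terminated lines.

    Returns (lines, remainder): `lines` are the complete messages,
    `remainder` is the trailing partial line to keep for the next chunk.
    """
    buf += chunk
    *lines, rest = buf.split(b"\r\n")
    return [line.decode("ascii") for line in lines], rest


def parse_update(line: str):
    """Split one comma-delimited update message into its fields."""
    return line.split(",")
```

Reading large chunks (e.g. recv(65536)) and splitting them in memory this way is usually far cheaper than scanning the socket one byte at a time looking for the CRLF.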

For instance, I know the client receives an update line every time a bid/ask changes (as part of the update types: one for tick, one for after hours, one for bid, one for ask, etc.), and in active issues such as GOOG (where even my normal display software can hardly keep up with all the changes!) this causes an excessive amount of traffic.

Having said this, a couple of questions and theories:

* Is there any way to limit this amount of data? Can you send me an update only when a tick occurs?
* Would establishing multiple connections to the update server help? E.g., for every 10 symbols, have one connection? Connection 1 would monitor A, B, C, D, E, F, G, H, I, & J, connection 2 K, L, M, N, O, P, Q, R, S & T, and so forth. Could this help the problem?
* Should I write a separate program that connects to the feed (10 symbols per connection) and, on a tick change (up or down), sends the message down a pipe to my analysis program? This way, it would only be reading tick changes as opposed to EVERYTHING.

Would any of this help? Being new to IQFeed, and not having access to the source code or knowledge of how the backend operates, I don't know the best method for monitoring data. I am asking the programmers of IQFeed, and anyone else who has used IQFeed to power their program without lag.
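[Editor's note: the third idea, forwarding only tick (last-trade) changes, can be sketched as a small stateful filter. The field layout here (symbol in field 1, last price in field 2) is a hypothetical example for illustration, not IQFeed's actual message format.]

```python
class TickFilter:
    """Forward an update only when the last-trade price actually changes."""

    def __init__(self):
        self.last = {}  # symbol -> last seen trade price

    def should_forward(self, line: str) -> bool:
        fields = line.split(",")
        symbol, price = fields[1], fields[2]
        if self.last.get(symbol) == price:
            return False           # bid/ask-only update: drop it
        self.last[symbol] = price  # genuine tick: remember it and forward
        return True
```

Each downstream analysis thread then sees only price changes instead of every bid/ask flicker.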

Thanks in advance!

Brandon

kdresser
-Interested User-
Posts: 71
Joined: Nov 25, 2004


Posted: Jan 14, 2006 12:17 AM          Msg. 2 of 7
Hello,

You will have to do any data limiting you require within your own processing.

It sounds like you've got a program bottleneck that you need to uncover and fix. Using more connections will probably hurt rather than help.

It took me a while to experiment and optimize, but on a 2.3GHz machine I can receive data for 500 NASDAQ symbols, do the processing you describe in your first paragraph and more, and still use less than 20% of the CPU on average.

Kelly

brandon
-Interested User-
Posts: 19
Joined: Jan 4, 2006


Posted: Jan 14, 2006 12:19 AM          Msg. 3 of 7
That's the bizarre thing -- I don't even use a percent or so of CPU! Perhaps I'm using usleep() too liberally? And any idea why it would hurt to use multiple connections?

Regards,

Brandon

kdresser
-Interested User-
Posts: 71
Joined: Nov 25, 2004


Posted: Jan 14, 2006 07:12 AM          Msg. 4 of 7
Brandon,

I said it might hurt because, in my experience, doing more of a wrong thing usually doesn't help matters, and the added complexity will cost something.

Sleeping doesn't sound like a good thing to be doing while you are being hosed down by a big stream of data. <grin>

Start small -- first do nothing but log the raw received stream to a file. Then you can experiment by reading that file to discover how best to crunch it.

Kelly

nsolot
-DTN Guru-
Posts: 273
Joined: Sep 4, 2004


Posted: Jan 14, 2006 08:05 AM          Msg. 5 of 7
Brandon,

I agree with Kelly. I'm watching QQQQ, which I'd guess is more active than GOOG, with no problems. I also have regional quotes enabled.

Which language are you using? I use C++.

My guess is the sleep() is the troublesome part. In my testing, sleep() causes not only the function in which the call is made to sleep, but other functions to suspend processing as well. As a workaround, when I want a particular function to wait a bit, I throw up a dialog, set a timer, and then dismiss the dialog.

Ned

stargrazer
-DTN Guru-
Posts: 302
Joined: Jun 13, 2005

Right Here & Now


Posted: Jan 14, 2006 09:52 AM          Msg. 6 of 7
The way I approached this problem is with a different programming paradigm. Instead of running polling/waiting loops, it may be more efficient to run an event-driven model, possibly in two or more threads: one thread for interface-related activities, and background threads for handling data arrival asynchronously.

This style of programming does require extra effort, but it will pay off handsomely in the long run.

The trick is to tie a subroutine to the socket on which your data is arriving. As data arrives in blocks, the socket automatically calls your event handler. Your handler parses the blocks into lines, then signals one event per line received. Once it is finished with the block, it returns and lets the operating system's socket handler take over again.

Be aware that sockets deliver data in chunks, not necessarily on nice EOL boundaries, so you may have to do some buffering yourself while waiting for the socket to get the EOL to you.

This lets the operating system take care of the 'sleep' time automatically.
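[Editor's note: a stdlib-only sketch of this event-driven shape (Python here, though the idea applies equally to C++): a reader thread blocks on the socket, buffers partial lines across chunk boundaries, and hands complete lines to the consumer through a queue, with no polling and no sleep() anywhere. The socketpair below stands in for the real port-5009 connection, and the messages are hypothetical.]

```python
import queue
import socket
import threading

def reader(sock, out_q):
    """Block on the socket, reassemble CRLF lines, queue each complete one."""
    buf = b""
    while True:
        chunk = sock.recv(4096)   # blocks until data arrives: the OS "sleeps" for us
        if not chunk:             # feed closed: signal the consumer and stop
            out_q.put(None)
            return
        buf += chunk
        *lines, buf = buf.split(b"\r\n")  # keep any trailing partial line
        for line in lines:
            out_q.put(line.decode("ascii"))

# Demo wiring: a socketpair stands in for the real feed connection.
feed, client = socket.socketpair()
q = queue.Queue()
threading.Thread(target=reader, args=(client, q), daemon=True).start()

feed.sendall(b"Q,GOOG,432.10\r\nQ,GO")   # note the split mid-message
feed.sendall(b"OG,432.15\r\n")
feed.close()

received = []
while (msg := q.get()) is not None:      # q.get() blocks too: again, no polling
    received.append(msg)
```

The consumer side wakes only when a whole line is ready, so CPU is spent on parsing rather than on spinning or sleeping.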

Hopefully this helps with your optimization.

squirlhntr
-Interested User-
Posts: 62
Joined: Feb 12, 2005


Posted: Jan 19, 2006 04:42 PM          Msg. 7 of 7
Yeah, don't use sleep(). And don't wait for things to process; make your processing asynchronous.

Use an event-driven organization and pass events between threads. If you're using C++, you might want to take a look at the free and excellent wxWidgets library.

Personally, I write in Python, and Twisted is superb for these sorts of things.