I don't do any document searching in my application. The problem is that there are far too many documents (I have customers with more than 1 million documents), so they use the standard Microsoft search, which works fine.
... MESSAGE hWnd. * Samples\oemboxch.prg: How to view OEM box characters in a Windows application. * samples\mariabig.prg: Read and browse a table with 4.1 million records from a MySQL cloud server, using the RecSet() class of the FWHMaria lib. * Enhancement: Class TWebView() new METHOD Dispatch( bAction ) where bAction ...
We now have a table 'custbig' on our FWH Cloud server with 4.1 million records. Let us see the speed. This speed is from accessing the data on a server in New York over the Internet; speeds on a local area network will be far better.

#include "fivewin.ch"

function ...
... generated. Training finished after 18.21 minutes, checkpoint 2000 generated. Checkpoints 1600 and 1800 will be transferred to wormhole.app. But we need 1.6 million runs, so save_steps needs to be optimized, and perhaps more.
... processing). For the size of the data tables most of us use, no indexes are ever necessary; we need to think about them when we are dealing with multi-million-record tables. Now, as for Seek: if you are using FWMariaDB for MySQL or MariaDB, this works with a single field. Note: no indexes need to be ...
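For readers coming from SQL: on the DBF side, what an index buys you on a multi-million-record table is an ordered lookup instead of a full-table scan. A minimal sketch (the table and tag names here are invented for illustration; it assumes a structural CDX with a tag on the ZIP field already exists):

```harbour
// Hypothetical CUSTOMER.DBF with a character field ZIP,
// indexed in a structural CDX under tag "Zip".
USE Customer VIA "DBFCDX" SHARED NEW
SET ORDER TO TAG Zip
SEEK "90210"                 // index lookup, not a table scan
IF Found()
   ? "first match at record", RecNo()
ELSE
   ? "no such zip code"
ENDIF
USE
```

Without the order set, the same search would have to walk every record, which is exactly where multi-million-record tables start to hurt.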
... be very fast. Some time ago I ran tests on this with a large database: it took 0.32 seconds to collect 20,543 records out of a one-million-record database across a LAN! So the display of one screenful of data is the bottleneck, and as Nages said, you may have to compromise some ...
Hi, did you know that a ListView can handle 100 million elements ... which other control can handle that many? :?: Since there is only one sample using ListView groups, I think FiveWin users do not know what a ListView can do and how fast it can be. But I have to ...
... scope, and then the low and high values you want to see, using oDB:setScope( xLow, xHigh ). The only records read will be those you want. I tested a 1-million-record DBF file using a scope to narrow down to about 20,000 records, and it takes less than a second! Using a filter instead takes 23 seconds.
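For anyone who has not used scopes: the post uses an oDB:setScope() wrapper, but the underlying Harbour mechanism is ordScope(), and the difference from a filter looks roughly like this (table, tag, and values invented for illustration):

```harbour
#include "ord.ch"

// Scope: the index order itself is restricted to the key range,
// so only the matching records are ever visited.
USE Address VIA "DBFCDX" SHARED NEW
SET ORDER TO TAG State
ordScope( TOPSCOPE,    "CA" )    // low key
ordScope( BOTTOMSCOPE, "CA" )    // high key
GO TOP                           // lands on the first "CA" record
// ... a browse here touches only the scoped records ...

// A filter, by contrast, reads and tests every record in the
// table, which is why it took ~23 seconds instead of under one:
// SET FILTER TO Address->State == "CA"
```

The design point is that a scope exploits the index, while a filter is evaluated record by record over the whole file.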
... browse program, all from OneDrive, and they are each just as fast as the first one. They are all opened in shared mode and browsing the same 1-million-record file with a scope. Each one opens in less than a second. From what I can tell, DBF is faster when working record-based, e.g. when SEEK() ...
Note that even running from OneDrive, it still took less than a second to open a browse showing a screenful of records from a subset (20,543) of the 1 million records in the database.
Here are my DBF speed tests. In all cases I am running the EXE from the same drive as the database. From a 1-million-record DBF it takes 0.39 seconds to find 20,543 records (all the addresses in California) on a network USB drive on my LAN. I used a scope for the test. Even ...
... you to always use setOrder(1) to search the primary-key field in any database. My ID fields are usually 6 digits, as that handles one less than a million numbers. I honestly don't understand what your common thread is and what the reservation number has to do with it. No government requirements. ...
... you to always use setOrder(1) to search the primary-key field in any database. My ID fields are usually 6 digits, as that handles one less than a million numbers. The RecNo() is needed because I am modifying that record, and when I search for that ID the record itself should not be considered ...
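The duplicate check described above can be sketched like this (a hedged illustration; the function and variable names are invented, and it assumes order 1 is the index on the 6-digit ID field):

```harbour
// nEditRec is the RecNo() of the record the user is editing now.
// Returns .T. only when some OTHER record already uses cId.
FUNCTION IdInUse( cId, nEditRec )
   LOCAL lDupe
   LOCAL nSaveRec := RecNo()
   SET ORDER TO 1                        // primary-key index
   SEEK cId
   // a hit only counts when it is not the record being modified
   lDupe := Found() .AND. RecNo() != nEditRec
   GO nSaveRec                           // restore the position
RETURN lDupe
```

This is why the RecNo() has to travel along with the edit: without it, saving an unchanged ID would falsely report the record as a duplicate of itself.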
... that non-English-speaking users have. The entry takes approx. 2 seconds longer, without even taking typing errors into account. That's hundreds of millions of euros that our economies waste every year. One must always be vigilant. Best regards, Otto. Honestly, without controversy, I did not understand this ...