Data files with app


  • Data files with app

    Hello,

    I'm starting on an app that uses a data file to find specific files
    on a user's computer.

    I'm looking for the best way to create the data file so that performance
    is optimal. The data file will be static, so it won't change while the
    program is in use. The records look like:

    1. ID
    2. Name
    3. Description (<--- Long one)

    Is a plain text file enough to read from, or will performance be better
    if I use something like a b-tree (maybe EZTree??)?
    Or are there other ways to create data files with plain text in them?

    Any help is welcome.

    Thanks
    Erwin


  • #2
    Interestingly enough, I'm working on an app that does exactly the same thing with ID3 tags at the moment. I purchased PowerTree from this website, and searches through several thousand rows on any indexed field were instantaneous. The samples that came with it were practically what I needed, almost without modification.

    ------------------
    Troy King
    katravax at yahoo dot com



    • #3

      Hi,

      Since your data never changes, you might want to just create
      a data DLL, maybe several of them. Once this is done, you can
      maintain it for possible future updates. Plus, if your data is
      not to be tampered with, this lets you add more security and
      also pack these DLLs to a smaller size, without any loss in
      use or speed.

      DLLs can handle data very quickly and, if written with speed
      in mind, can be just as fast as any database.
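
      A minimal sketch of that idea, assuming a PB/DLL project, a
      made-up export name (GetDescription) and placeholder data; with
      thousands of records the function body would be generated from
      the source data rather than typed in:
      Code:
      ' Sketch: a tiny data DLL that hands out a record's description by ID.
      #COMPILE DLL

      FUNCTION GetDescription ALIAS "GetDescription" (BYVAL id AS LONG) EXPORT AS STRING
        SELECT CASE id
          CASE 1    : FUNCTION = "Main documentation file"       ' placeholder data
          CASE 2    : FUNCTION = "Installer for the application"  ' placeholder data
          CASE ELSE : FUNCTION = ""                               ' unknown ID
        END SELECT
      END FUNCTION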

      My $0.02
      Mike




      ------------------
      mwm



      • #4
        You always get the fastest database handling if you can use fixed
        record lengths. This allows you to read and edit any record of the
        file almost instantly. In your case, you could declare a type that
        looks like:
        Code:
        type myrec
          id   as long
          name as string * 256
          text as string * 1024  'whatever you need
        end type
        global mydata as myrec
        Then you can use GET and PUT together with recnum * LEN(mydata)
        to get/set data from/to any record directly, using mydata. The
        advantages are speed and easy handling. The disadvantage is that
        each record has a fixed size, here 1284 bytes, even if it only
        contains a few bytes of text.
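
        For example, with the file opened FOR RANDOM and LEN = LEN(mydata),
        the runtime does the record-offset arithmetic for you. A minimal
        sketch (the file name "files.dat" is an assumption):
        Code:
        ' Sketch: fetch one fixed-length record by its 1-based record number.
        FUNCTION GetRecord(BYVAL recnum AS LONG) AS STRING
          OPEN "files.dat" FOR RANDOM AS #1 LEN = LEN(mydata)
          GET #1, recnum, mydata              ' jump straight to that record
          CLOSE #1
          FUNCTION = RTRIM$(mydata.name) + ": " + RTRIM$(mydata.text)
        END FUNCTION

        PUT works the same way in the other direction if you ever need to
        write a record back.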

        Dynamic strings are a bit trickier. In my POFFS project, a record
        can vary from a few hundred bytes up to >100 KB, so in order to
        speed things up, I've created an index file where each record's
        start and length is registered. Then I read the relatively small
        index file into an array first, after which I can pick out the
        address of any record from that array. Works fine for reading,
        but not so fine if you want to change or delete a record.
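
        A rough sketch of that index idea, assuming a companion index
        file of offset/length pairs (the file names "data.idx" and
        "data.dat" are made up):
        Code:
        ' Sketch: load the small index, then fetch one variable-length record.
        TYPE IdxEntry
          offset AS LONG        ' byte position of the record (1-based)
          length AS LONG        ' record length in bytes
        END TYPE

        FUNCTION GetRecordText(BYVAL recnum AS LONG, BYVAL numRecs AS LONG) AS STRING
          LOCAL i AS LONG
          LOCAL buffer AS STRING
          DIM idx(1 TO numRecs) AS IdxEntry
          OPEN "data.idx" FOR BINARY AS #1
          FOR i = 1 TO numRecs
            GET #1, , idx(i)                  ' read the index sequentially
          NEXT
          CLOSE #1
          OPEN "data.dat" FOR BINARY AS #2
          buffer = SPACE$(idx(recnum).length)
          GET #2, idx(recnum).offset, buffer  ' read exactly that record's bytes
          CLOSE #2
          FUNCTION = buffer
        END FUNCTION

        (In a real program the index would be read into a global array once
        at start-up, as described above, rather than on every call.)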

        That's one way of doing it. One can also save everything into a
        text file and let special characters act as delimiters between
        the different fields. The PARSECOUNT/PARSE$ combination works fine
        there, but it can be a bit slower on very large data files.
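
        A small sketch of that route, assuming one record per line with
        tab-separated fields in the order id, name, description (the
        delimiters and field order are assumptions):
        Code:
        ' Sketch: delimited text searched with PARSECOUNT/PARSE$.
        FUNCTION FindByName(sData AS STRING, BYVAL sWanted AS STRING) AS STRING
          LOCAL i, nRecs AS LONG
          LOCAL sLine, sEol, sSep AS STRING
          sEol = CHR$(13) + CHR$(10)                 ' record delimiter: CRLF
          sSep = CHR$(9)                             ' field delimiter: tab
          nRecs = PARSECOUNT(sData, sEol)
          FOR i = 1 TO nRecs
            sLine = PARSE$(sData, sEol, i)           ' one record
            IF PARSE$(sLine, sSep, 2) = sWanted THEN ' field 2 = name
              FUNCTION = PARSE$(sLine, sSep, 3)      ' field 3 = description
              EXIT FUNCTION
            END IF
          NEXT
        END FUNCTION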

        However, all of the above involves reading data from a file. If you
        already know what the data is, the fastest way is to include it
        in a special sub or function, for example by using DATA/READ$. I
        once did a rich edit syntax color sample that uses this technique
        and found the DATA/READ$ combination to be pretty fast.
        (http://www.powerbasic.com/support/pbforums/showthread.php?t=22684)
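
        Roughly, the DATA/READ$ technique could look like this (the DATA
        values are placeholders):
        Code:
        ' Sketch: data compiled straight into the function, read with READ$.
        FUNCTION DescriptionById(BYVAL wantedId AS LONG) AS STRING
          LOCAL i AS LONG
          ' Each group of three DATA items: id, name, description.
          DATA "1", "readme.txt", "Main documentation file"
          DATA "2", "setup.exe", "Installer for the application"
          FOR i = 1 TO DATACOUNT STEP 3
            IF VAL(READ$(i)) = wantedId THEN
              FUNCTION = READ$(i + 2)        ' third item of the matching group
              EXIT FUNCTION
            END IF
          NEXT
        END FUNCTION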



        • #5
          Thanks for your reactions.

          The DLL is a very good idea.

          Borje, I like the way you did it in the POFFS project. Very good
          code, and the way you did the tabs is also very useful.

          Anyway, I will try both ways and see which gives the best
          performance.

          Erwin



          • #6
            Hmmm ...
            Putting a table into a DLL means that you will load the whole file into memory.
            If the data file isn't gigantic, you could just as well include it as RCDATA or
            create a classic external file and read it whole into memory.
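
            For example, reading a whole file into memory takes only a few lines (the function and file names are made up):
            Code:
            ' Sketch: pull the whole (static) data file into one string in memory.
            FUNCTION LoadWholeFile(BYVAL sFile AS STRING) AS STRING
              LOCAL buffer AS STRING
              OPEN sFile FOR BINARY AS #1
              buffer = SPACE$(LOF(1))
              GET #1, 1, buffer              ' read LOF(1) bytes in one go
              CLOSE #1
              FUNCTION = buffer
            END FUNCTION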

            Really, variable-length records are not a problem. You only need to create an index array Idx(n + 1), where Idx(i) is the offset of record i. The length of record i is then Idx(i + 1) - Idx(i).

            If the records are sorted, the easiest and fastest search algorithm is binary search (repeatedly dividing the range by 2).
            For example, say you have 1 million records.
            First, you compare the key with record 500000. Suppose the key is greater; then you compare the key with record # (500000 + 1000000) / 2 = 750000, and so on.
            On average you need only about 20 comparisons (Log2 of the record count) to find a record.
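
            A sketch of that binary search over the fixed-length records from post #4, assuming the file is sorted by ID, opened FOR RANDOM as #1 with LEN = LEN(rec), and that numRecs holds the record count (all names are made up):
            Code:
            ' Sketch: binary search on records sorted by id.
            FUNCTION FindById(BYVAL wantedId AS LONG, BYVAL numRecs AS LONG) AS LONG
              LOCAL iLo, iHi, iMid AS LONG
              LOCAL rec AS myrec
              iLo = 1 : iHi = numRecs
              DO WHILE iLo <= iHi
                iMid = (iLo + iHi) \ 2
                GET #1, iMid, rec            ' one disk read per halving step
                IF rec.id = wantedId THEN
                  FUNCTION = iMid            ' record number of the match
                  EXIT FUNCTION
                ELSEIF rec.id < wantedId THEN
                  iLo = iMid + 1             ' key is in the upper half
                ELSE
                  iHi = iMid - 1             ' key is in the lower half
                END IF
              LOOP
              FUNCTION = 0                   ' not found
            END FUNCTION

            Each halving step costs one GET, so even a million records need only about 20 reads.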
