"Github" Database Connection Error on mdb export

Dear all,
for some days now, CyberTracker has been throwing the following error when exporting a CTX database to an MDB file:

CT06 Database Connection Error
FNames should be empty (C:\GitHub\CyberTrackerDesktop\Framework\UnxXMLFiler.pas, line 270)

The same issue occurs across different computers, none of which has a GitHub folder at that location (C:).

I updated from version 3.517 to 3.520 to see whether an update would fix the issue, but it did not.

Any ideas what could be wrong with our CTX file?

Thanks and all the best,
Oliver
ikai.uni-koeln.de

Hi Oliver,

The error message is related to the location in the source code that is faulting. It looks like you have run into a database corruption error.

If you send me your MDB, I can try to figure it out.

Cheers,

-Justin

Hi Justin,

thanks a lot for your support. I have just sent you an email with download details.

I sent two files, a CTX and an MDB. The CTX file seems to run fine, but once we export it to an MDB file, the process fails with the error message above. The MDB is the exported file: it is >800 MB, but it cannot establish a connection to the database.

During fieldwork we had some issues because MDB files sometimes did not save properly. They hung during saving until we had to kill the process (after a couple of hours). To fix this, we took a previous MDB and updated the application. We then also realized that we do not need to save the MDB after downloading data from our smartphones, so we made no further changes to the application. Since then, downloading and viewing data has worked nicely.

When we came back from fieldwork and CT (c. 3.517) had access to the internet, it threw some JavaScript errors relating to cybertracker.org, so we assumed these errors did not affect the database. We then saved the MDB to a CTX to make some changes to the Report section. When we wanted to export the CTX back to an MDB file, the process died with the reported database connection error. I updated to v. 3.520 to see whether it would fix the export issue. The JavaScript errors are now gone, but the MDB export issue remains.

Since the CTX file fails to export to an MDB file, I wonder which file is corrupt: the CTX, the MDB, or both?

Thanks a lot,
Oliver

Hi Oliver,

I think the issue here is that you have a very high resolution track timer running, and this has resulted in many hundreds of thousands of timer points. If we pare them down a bit, then the database should be much more manageable. Is that an option for you?

Cheers,

-Justin

Hi Justin,
yes, indeed we always set the GPS timer to 1 second. In the future we should set it to 2 or 3 seconds so as not to run into the same problem again. We had already realised that our CT databases are very slow (e.g. making changes to the report view becomes very sluggish), but since they had not crashed so far, we stuck to a 1-second GPS resolution.

Reducing the GPS resolution to some degree would be no problem. If there is a way to remove e.g. every second GPS track point, that should be fine, so that we have at least one GPS point every 2, 3, 4 or 5 seconds. It should not be much less than that, though.

I would be interested to learn how you remove certain GPS points, so that we can fix this issue on our own in the future.
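Just to illustrate what I mean, here is a minimal Python sketch (my own, not CyberTracker code; the tuple layout and 2-second gap are just example assumptions) that keeps at most one track point per minimum time gap:

```python
def thin_track(points, min_gap_s=2):
    """Keep at most one GPS point per min_gap_s seconds.

    points: list of (timestamp_seconds, lat, lon) tuples, sorted by time.
    Returns a thinned copy; the first point is always kept.
    """
    kept = []
    last_t = None
    for t, lat, lon in points:
        if last_t is None or t - last_t >= min_gap_s:
            kept.append((t, lat, lon))
            last_t = t
    return kept

# Example: a 1-second track thinned to a 2-second resolution
track = [(0, 50.930, 6.950), (1, 50.931, 6.951),
         (2, 50.932, 6.952), (3, 50.933, 6.953)]
print(thin_track(track, min_gap_s=2))  # keeps the points at t=0 and t=2
```

Something like this applied to an exported track would give us the "one point every N seconds" result described above.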

Another option could be to export the GPS tracks to a shapefile and then remove the tracks entirely from the CT database (and merge them back later in R or a GIS), if there is a way to remove the tracks independently of the data points.

Many thanks,
Oliver

Hi Oliver,

I looked at this a bit more, and I think the larger issue is that the images you have captured are pretty big. They average around 12 MB, and there are lots of them. The error you were getting during save was not very clear, but essentially you were running out of memory.

I will shoot you a separate email with the data and images separated.

To reduce the probability of this occurring again, some ideas:

  • Reduce the size of images that you take with the camera
  • Break your data sets down into smaller chunks and work on them independently
  • Decrease the track timer resolution
  • Try and only use CTX to transport smaller files
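As an illustration of the first point, if you want to shrink photos outside CyberTracker before they go into a database, a small sketch using the Pillow library could look like this (the folder layout and the 1600 px cap are just example assumptions, not CyberTracker behaviour):

```python
from pathlib import Path

from PIL import Image  # Pillow

def shrink_photos(folder, max_edge=1600, quality=85):
    """Downscale every JPEG in `folder` so its longest edge is <= max_edge.

    Overwrites the files in place; thumbnail() preserves the aspect ratio
    and never enlarges, so already-small photos are left untouched.
    """
    for path in Path(folder).glob("*.jpg"):
        with Image.open(path) as img:
            img.thumbnail((max_edge, max_edge))
            img.save(path, quality=quality)
```

A 4000x3000 photo would come out as 1600x1200, typically well under 1 MB at this quality setting.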

Unfortunately, the system was built long enough ago that we do not do a great job handling many images at once.

Cheers,

-Justin

Hey Justin,
wonderful, thank you so much! The database now runs fine and we are able to use it again. That’s really great.

Yes, we are aware that too many pictures make the database grow fast and become slow. But I was not aware that it would crash once its size exceeds 700 MB. In my eyes that is not even particularly large for a database these days.

It is so great that CyberTracker supports pictures, videos, voice notes, and high-resolution GPS tracking. But if the database crashes when people make use of these media, it is not really worth it. I assume that one of the big issues here is that CyberTracker is based on the 32-bit Microsoft MDB engine, which is slow and cannot use more than 2 GB of RAM even if more is installed on the machine. In the past weeks and months there were quite a few others whose CT databases crashed simply because they became too large. Given all the abilities CyberTracker has, I can imagine that switching to another system is very difficult. But nowadays there are so many different and faster database engines available. I believe the CyberTracker software could overcome many of these issues if another (open-source) database system were used.

Do you think there is a possibility that the Cybertracker database engine could be changed from the MDB to another one in the near (or far) future?

Thank you so much,
Oliver

Hi Oliver,

CyberTracker supports Postgres/MySQL/MS SQL Server, so if your database is going to get large, that is an option.

Having lots of timer points is okay, as long as you do not query them all at once on a map.

Large images are fine, but this makes CTX files of the whole database very large. I think keeping the database in MDB format is correct here.

Cheers,
-Justin

Hi Justin,
you are right, we should try existing technologies first. Installing SQL server infrastructure on a local laptop is not a problem (we do not have internet/network access during fieldwork).

One question before we start setting this up: are media (images, videos, audio files) stored in these Postgres/MySQL/MS SQL CT databases as blobs or as external files? I ask because if they are stored as blobs inside the CT database again, I assume these databases will also get large, bulky and slow, right?

If this is the case, perhaps we should instead remove the camera and audio options from our CT application to keep our local MDB files small.

Thanks a lot,
Oliver

Hi Oliver,

Media files (sound, photos) are stored in a separate table in the database. They are not loaded unless needed in a query. They should not make anything slower.

Media files in the application itself (e.g. with an Image control) are stored along with the screens and this will slow down loading and saving. The solution for that is to put them into Elements and use an “Element Image” control.

The things in your database which made things slow were:

  1. Lots of track timer points in combination with a map query which tried to render them all together. If you did not have a map open displaying the entire date range, then you should have been okay.
  2. Using the CTX system with the entire database. CTX is primarily used for smaller data files.
  3. Unscaled photos. You can have CyberTracker shrink them down during capture if needed.

If you move to SQL Server, here is the section which can help you: https://cybertrackerwiki.org/classic/advanced-topics/#database-servers.

Over time we are working toward an online solution, so we would prefer not to make changes to the Windows Desktop app.

Cheers,

-Justin