The engineers at Alaska Software have published something that I never understood.
The problem is that filters (DbSetFilter()/SET FILTER TO) are converted into WHERE clauses that are executed by the PostgreSQL server, and that means you can use almost nothing in filter expressions. You can't use memory variables or workareas, you can't use your own functions or most of the functions provided by Xbase++, and you can't build filters more complex than simple expressions like "name = 'Miller'". In short: filters are f*cking useless with the PGDBE.
What the ilx-document shows is how functions used in Xbase++ filters can be rebuilt on the server side as so-called "stored procedures" - functions programmed for the SQL server. You have to make sure that those procedures behave exactly like your Xbase++ functions, that they are installed with every PG server your customers use, and that they get adapted to newer Postgres versions. Besides, you have to learn a new programming language. I don't think this is a real solution, and there are lots of functions you simply can't build as stored procedures. And you still don't have your workareas or variables there. In my humble opinion, this is complete bullshit. Since we try to support all database structures in one code base (DBFNTX, FOXCDX, both with ADS, and PGDBE), we can't overload our code that way.
But this really is a problem with filters only. If you have a very complex filter expression dealing with UDFs, lots of workareas, results from other filters and so on, you will
never get it working with the PGDBE.
If you do something like this:
Code: Select all
DbSetFilter({||<MyVeryComplexExpression>})
DbGoTop()
DO WHILE !Eof()
* do something with the collected records
DbSkip(1)
ENDDO
You can adapt this code (for all database models) without any loss whatsoever by changing it to this:
Code: Select all
DbGoTop()
DO WHILE !Eof()
IF Eval({||<MyVeryComplexExpression>})
* do something with the collected records
ENDIF
DbSkip(1)
ENDDO
since "MyVeryComplexExpression" is now evaluated locally on the result set. This works with any code you used before.
If you want it more compact, use DbEval() and pass the expression as the FOR condition:
Code: Select all
DbEval( {|| <DoSomethingWithTheRecord> }, {|| <MyVeryComplexExpression> } )
With the standard DBEs, you won't see any negative effect from this, since it's essentially how filters already work there: the filter expression is evaluated with every movement in the table, and records that don't match are skipped. But the PGDBE is different - the result set (the "cursor") is loaded into memory, and if your tables are large, a large result set is loaded, which costs traffic and time.
That is the advantage of how Alaska implemented filters for the PGDBE: they create a much smaller result set, and they do it really fast, even without indexes involved. But, as mentioned above, those filters are very limited - you just can't do what you did before. If you search the Alaska Knowledge Base for "PGDBE", you will find a lot of open PDRs concerning filters, and I don't believe they will ever get closed unless Alaska builds a "fallback to local", as they did for the ADS.
But this is only a solution for this kind of filter usage. For small tables, we built a function BuildSQLfilter(bMyExpression). It collects the records matching the expression and creates a filter expression for them that the PGDBE can use. It relies on the "__record" field available in every table converted to PGDBE/ISAM and builds a string expression from it. But this is very limited. There are other strategies for dealing with filters, but in order to get a stable application, we are working on getting rid of all remaining filters.
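For the record, here is a minimal sketch of how such a BuildSQLfilter() helper might look. This is my illustration of the idea, not our exact production code; it assumes the "__record" field mentioned above is present in the workarea:
Code: Select all
FUNCTION BuildSQLfilter( bMyExpression )
   LOCAL cFilter := ""
   // walk the whole table once on the client and collect the
   // __record values of all rows matching the complex expression
   DbGoTop()
   DO WHILE !Eof()
      IF Eval( bMyExpression )
         IF !Empty( cFilter )
            cFilter += " .OR. "
         ENDIF
         cFilter += "__record == " + Var2Char( FIELD->__record )
      ENDIF
      DbSkip( 1 )
   ENDDO
   // the resulting string contains only field/constant comparisons,
   // which the PGDBE can translate into a WHERE clause
RETURN cFilter
The returned string can then be used as a PGDBE-compatible filter expression. Obviously this only pays off for small tables, since the full scan happens on the client side first.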