New build process: keep it simple

(Old) Problems

For Openbravo ERP developers (especially newcomers) it has always been difficult to decide which of the available build tasks was the best choice to build the system after a change. We had to take into account which modifications had been made in order to know which ant task to run. For example, if we had modified a window we would use ant compile -Dtab=myWindow to generate the code just for that window and not for the rest. It was even worse when working with Subversion: each time we updated our working copy we had to check which files had changed to know whether it was necessary to run update.database to synchronize the Openbravo model database (database schema objects and application dictionary data) from the XML files. And if there were modifications there, it was safer to regenerate all the WAD windows, because it was difficult to know which ones had actually changed. So compile.complete often ended up being the “safe” but slow choice.

The update.database task had two more drawbacks. The first was that if the Openbravo model had been modified locally and not exported (ant export.database), executing this task would lose all the changes made in the database to the application dictionary. This annoying behavior had been reported as a bug. The second was that during the r2.50 development cycle, especially because of the usage of DAL as part of the update process, this task was fairly unstable, so people were not very confident about using it. As a result, people felt safer recreating the whole system (ant install.source) instead of running a much faster incremental build (ant update.database compile.complete).

New task: smartbuild

For Openbravo ERP r2.50 we have resolved these problems by simplifying the build process with a new incremental build task: smartbuild, which is currently available in trunk (r12753) and will be released in the next alpha (alpha r11). This task performs all the processes required to build your system, but only the required ones, which is a huge improvement in performance. It checks whether the database needs to be updated from the XML sources and performs the update only if needed, generates only the code that needs to be regenerated, then compiles and deploys it.

The goal of smartbuild is to replace most of the other tasks, making life a little simpler for developers. Now only two tasks are needed: smartbuild for all builds and export.database to export the database to XML files. export.database is now smart enough to export only when needed, skipping the process if no changes have happened in the local Openbravo model.
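In practice, a typical development cycle now boils down to something like this (a sketch of a usual workflow, not a mandated sequence; the comments are mine):

    svn update          # bring the working copy up to date
    ant smartbuild      # update database, regenerate and compile only what changed
    ...do your development...
    ant export.database # only needed if you touched the Openbravo model; skips itself otherwise
    ant smartbuild      # incremental rebuild again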

Moreover, update.database now checks, before updating, that no local changes have occurred in the Openbravo model since the last synchronization (export.database or update.database), to prevent people from losing their changes. If there are local changes, they will be required to export their database before updating it.
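Conceptually, the guard boils down to something like the following simplified Java sketch (this is not the actual Openbravo build code; the class and method names are invented for illustration):

    // Hypothetical sketch of the pre-update guard applied by update.database.
    public class UpdateDatabaseGuard {

      /**
       * Aborts the update when the local Openbravo model has changed since the
       * last synchronization, so the developer exports first and nothing is lost.
       */
      public void checkBeforeUpdate(boolean adDataModified, boolean dbStructureModified) {
        if (adDataModified || dbStructureModified) {
          throw new IllegalStateException(
              "Local changes detected in the Openbravo model since the last export/update. "
                  + "Run 'ant export.database' before 'ant update.database' to avoid losing them.");
        }
        // No local changes: it is safe to update the database from the XML sources.
      }
    }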

How it works

  • Determine whether the database needs to be updated. To do this, smartbuild generates a checksum for the XML files and compares it with the stored one, which is regenerated each time the database is synchronized from or to the XML files. If the two checksums differ, the XML files have changed, so the database is updated (see the first sketch after this list).
  • Decide which code needs to be regenerated. Whenever a build is performed, a timestamp with the current time is stored in the database. This timestamp is compared with the audit info of the application dictionary objects that take part in code generation, so WAD is now able to generate code only for those elements that were created or modified after the last build (see the second sketch after this list). Additionally, when exporting the database to XML files the audit info is no longer exported, and when updating, the audit info is recalculated with the current time, so this also works when the application dictionary modifications come from an update.database. There is only one case where this check fails: when application dictionary elements are modified directly in the database through insert/update SQL statements without updating the audit info. In that case the developer has to generate the code the old way (using compile -Dtab=modifiedWindows).
  • Check whether the database has been changed. This check allows exporting only when there are changes in the database and prevents data loss when updating it. It uses the same build timestamp described above: modifications in data are detected by DAL, while modifications in the database structure are queried directly from the database (see the third sketch after this list). The query for the last structural modification is no problem in Oracle, because the User_Objects dictionary view stores the last physical change of each database object, but PostgreSQL does not store that information. For PostgreSQL this has been solved by generating a checksum in the database from all the elements that can be exported to XML files, which is why this check takes longer in PostgreSQL than in Oracle.
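The three checks above can be illustrated with a few simplified Java sketches. They are not the actual Openbravo build code; class names, file locations and queries are assumptions made for illustration only. The first sketch shows the idea behind the XML checksum comparison:

    import java.io.File;
    import java.io.FileInputStream;
    import java.io.IOException;
    import java.io.InputStream;
    import java.security.MessageDigest;
    import java.security.NoSuchAlgorithmException;
    import java.util.Arrays;

    // Hypothetical sketch: decide whether the database must be updated by comparing
    // a checksum of the XML model files with the checksum stored at the last sync.
    public class XmlChecksum {

      public static String checksumOf(File xmlDir) throws IOException, NoSuchAlgorithmException {
        MessageDigest md = MessageDigest.getInstance("MD5");
        File[] files = xmlDir.listFiles((dir, name) -> name.endsWith(".xml"));
        if (files == null) {
          return "";
        }
        Arrays.sort(files); // stable order so the checksum is reproducible
        byte[] buffer = new byte[8192];
        for (File f : files) {
          try (InputStream in = new FileInputStream(f)) {
            int read;
            while ((read = in.read(buffer)) != -1) {
              md.update(buffer, 0, read);
            }
          }
        }
        StringBuilder hex = new StringBuilder();
        for (byte b : md.digest()) {
          hex.append(String.format("%02x", b));
        }
        return hex.toString();
      }

      /** The database is updated only when the two checksums differ. */
      public static boolean updateNeeded(String currentChecksum, String storedChecksum) {
        return !currentChecksum.equals(storedChecksum);
      }
    }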
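The second sketch shows the timestamp comparison that lets WAD regenerate only the windows touched since the last build. The table and column names come from the Openbravo application dictionary, but the query and class are simplified assumptions:

    import java.sql.Connection;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;
    import java.sql.SQLException;
    import java.sql.Timestamp;
    import java.util.ArrayList;
    import java.util.List;

    // Hypothetical sketch: select only the tabs whose audit info is newer than the
    // timestamp stored by the previous build, so WAD regenerates just those windows.
    public class WadRegenerationFilter {

      public List<String> tabsModifiedSince(Connection conn, Timestamp lastBuild) throws SQLException {
        String sql = "SELECT name FROM ad_tab WHERE created > ? OR updated > ?";
        List<String> tabs = new ArrayList<>();
        try (PreparedStatement ps = conn.prepareStatement(sql)) {
          ps.setTimestamp(1, lastBuild);
          ps.setTimestamp(2, lastBuild);
          try (ResultSet rs = ps.executeQuery()) {
            while (rs.next()) {
              tabs.add(rs.getString(1));
            }
          }
        }
        return tabs; // an empty list means WAD has nothing to regenerate
      }
    }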
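The third sketch outlines the structural-change check. In Oracle the last DDL time of each object is available in the User_Objects dictionary view; for PostgreSQL the sketch simply delegates to a database-side checksum function standing in for the mechanism described above (the function name is invented):

    import java.sql.Connection;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;
    import java.sql.SQLException;
    import java.sql.Timestamp;

    // Hypothetical sketch: detect structural changes in the database since the last build.
    public class StructureChangeCheck {

      /** Oracle: User_Objects keeps the time of the last physical change per object. */
      public boolean changedSinceOracle(Connection conn, Timestamp lastBuild) throws SQLException {
        String sql = "SELECT COUNT(*) FROM user_objects WHERE last_ddl_time > ?";
        try (PreparedStatement ps = conn.prepareStatement(sql)) {
          ps.setTimestamp(1, lastBuild);
          try (ResultSet rs = ps.executeQuery()) {
            rs.next();
            return rs.getInt(1) > 0;
          }
        }
      }

      /**
       * PostgreSQL: no last-change time is stored, so a checksum is computed in the
       * database over every exportable element and compared with the stored one.
       * ob_structure_checksum() is an invented placeholder for that database function.
       */
      public boolean changedSincePostgres(Connection conn, String storedChecksum) throws SQLException {
        String sql = "SELECT ob_structure_checksum()";
        try (PreparedStatement ps = conn.prepareStatement(sql);
            ResultSet rs = ps.executeQuery()) {
          rs.next();
          return !storedChecksum.equals(rs.getString(1));
        }
      }
    }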