We require Confluence 5.10 and Java 8.

Advanced search is only available when using PostgreSQL.


* We use Atlassian's official storage library, ActiveObjects. Atlassian tests it thoroughly with all database vendors, but we have occasionally encountered bugs with MS SQL Server and Oracle. Please report any errors to us and we will fix them.

Out of Memory Errors when a matrix is displayed

Building the coverage, dependency and traceability matrices may use a lot of resources.

See Global limits for information about size limits, and see Performance for details on resource usage.


Maximum number of requirements

There is no hard limit on the number of requirements, either per space or globally. However, Requirement Yogi relies on ActiveObjects, a storage library provided by Atlassian, which has a limit of 2,147,483,647 row creations per table.

The table that receives the most inserts is AO_32F7CE_AOPROPERTY, which contains the properties of requirements. Each time a page is saved, Requirement Yogi deletes its requirements and their associated objects, then recreates those that still exist in the new version of the page. Since ActiveObjects does not reuse the IDs of deleted rows, the IDs keep growing. The following condition must always be satisfied:

n_pages × n_edits × n_req × n_prop  <  L

n_pages   Number of pages with requirements
n_edits   Number of edits of the page
n_req     Number of requirements on the page
n_prop    Number of properties per requirement
L         The limit, 2,147,483,647 (~2 billion)

The product must be summed over all pages with requirements.

For example, 200 pages with 1,000 requirements each, 10 properties per requirement, and 1,000 edits per page amounts to 2,000,000,000 inserts, which stays just under the limit.
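The condition above can be checked with a short calculation. The sketch below is a hypothetical helper (the function names are ours, not part of Requirement Yogi) that sums the estimated inserts over all pages and compares the total against the ActiveObjects limit:

```python
# Hypothetical helper: estimate ID consumption in AO_32F7CE_AOPROPERTY.
AO_ID_LIMIT = 2_147_483_647  # maximum row creations per ActiveObjects table

def estimated_inserts(pages):
    """pages: iterable of (n_edits, n_requirements, n_props_per_req) per page."""
    return sum(edits * reqs * props for edits, reqs, props in pages)

def within_limit(pages):
    return estimated_inserts(pages) < AO_ID_LIMIT

# 200 identical pages, each edited 1,000 times, with 1,000 requirements
# carrying 10 properties each:
pages = [(1_000, 1_000, 10)] * 200
print(estimated_inserts(pages))  # 2000000000
print(within_limit(pages))       # True
```

Note that the total grows with every edit, not only with the current number of requirements, because deleted rows never give their IDs back.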

How to solve it if you reach the limit?

You can resolve the issue by "compacting" the IDs. The operation consists of renumbering the primary keys in the tables from 0, leaving no gaps, then resetting the database sequences that provide the next available IDs.
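Conceptually, the renumbering step builds a gap-free mapping from old IDs to new ones. The sketch below is only an illustration of that idea (not Requirement Yogi's actual implementation, which also has to rewrite foreign keys that reference these IDs):

```python
def compact_ids(used_ids):
    """Map each surviving ID to a new gap-free ID starting at 0,
    preserving relative order (conceptual sketch only)."""
    return {old: new for new, old in enumerate(sorted(used_ids))}

# Sparse IDs left behind after many delete/recreate cycles:
mapping = compact_ids([7, 1_500_000_000, 42, 2_000_000_003])
# The sequence would then be reset to len(mapping), the next free ID.
```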

✓ Since version 3.0 (2021), there is no limit anymore, apart from the speed of the database and indexes.