Termination Competition 2014

Revision as of 19:40, 13 May 2014

In 2014, the Termination Competition will be part of the FLoC Olympic Games.

Competition Categories

(to be announced)

Dates (tentative)

  • registration of solvers: June 15
  • updates of registered solvers, submission of new benchmark sets: July 1
  • competition starts: July 15
  • results announced: during the FLoC Olympic Games Award Ceremony in Vienna.

Technical Detail

We intend to run the competition on StarExec (http://www.starexec.org/), a cross-community solver execution and benchmark library service under joint development (since 2012) at the University of Iowa and the University of Miami. Its goal is to facilitate the experimental evaluation of logic solvers and other automated reasoning tools by providing a shared storage and computing infrastructure to store, manage, and make available benchmark libraries; to execute comparative evaluations; and to run solver competitions.

For displaying the results of the termination competition, we are building a separate web service.

Here is some information for participants:

  • Submit solvers for testing the platform: http://lists.lri.fr/pipermail/termtools/2014-February/000940.html
  • How to register on StarExec [2]. NOTE: this registration is for testing only, not for the actual competition; there may be service interruptions (e.g., sudden changes in the benchmark arrangement).
  • Is STAREXEC_MAX_MEM only 750 MB? [3] (UPDATE: this information was out of date; the limit is actually 64 GB.)
  • Beware if your solver writes to stderr: the post-processor (which, for termination, expects YES|NO on the first line) sees stdout and stderr merged. Any trace/log output must therefore appear only after the actual answer; see the sketch after this list.
  • Hardware on StarExec: see [4].
  • Software on StarExec: RHEL 6.3 (so CentOS 6.3 should be a close match); for the package selection, see [5].
  • Java is available on the StarExec nodes (JDK 1.6.0_41); see [6].
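
To make the last two technical points concrete, here is a minimal sketch of a solver entry point in Java (a JDK is available on the StarExec nodes, see the last item). It buffers all diagnostics, prints the answer as the first line of stdout, and only then emits the trace. The class name and the analyze() helper are illustrative assumptions, not part of any StarExec interface; STAREXEC_MAX_MEM is only logged here, since its exact format is not documented on this page.

 // Minimal sketch only: SolverMain and analyze() are illustrative names,
 // not part of any StarExec interface.
 public final class SolverMain {

     public static void main(String[] args) {
         // Collect diagnostics in memory instead of printing them right away.
         StringBuilder trace = new StringBuilder();

         // STAREXEC_MAX_MEM is exported on the execution nodes (see above);
         // its exact format is not documented here, so we only log it.
         String maxMem = System.getenv("STAREXEC_MAX_MEM");
         if (maxMem != null) {
             trace.append("STAREXEC_MAX_MEM = ").append(maxMem).append("\n");
         }

         // Run the actual termination analysis (placeholder below).
         String answer = analyze(args, trace);

         // The post-processor sees stdout and stderr merged and expects the
         // answer on the very first line, so print it before anything else.
         System.out.println(answer);
         System.out.flush();

         // Only now is it safe to emit trace/log output. Writing to
         // System.err earlier could have pushed the answer off line 1.
         System.out.print(trace);
     }

     // Placeholder for the real analysis; it returns what the post-processor
     // looks for (YES or NO), or MAYBE when the tool gives no answer.
     private static String analyze(String[] args, StringBuilder trace) {
         trace.append("example trace line\n");
         return "MAYBE";
     }
 }

The same ordering applies to tools written in any other language: the answer goes out first, everything else afterwards, and nothing should be written to stderr before the answer line.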

Information on the web GUI (under construction) for presenting the results can be found on the page Star_Exec_Presenter.

A separate page with impressions from using the demo version of StarExec in 2012: Star_Exec_Test