Termination Competition 2020

In 2020, the Termination and Complexity Competition will be affiliated with IJCAR.

The competition will be run on the StarExec platform shortly before IJCAR, followed by a bug/conflict reporting phase and an opportunity for bug fixes. The final results, as well as the contributions made through bug reports, will be presented online at IJCAR.

Status

  • Final Result: https://termcomp.github.io/Y2020/
  • Initial Result: https://termcomp.github.io/Y2020-1/

Dates

  • Tool and Problem Submission: June 19, 2020 (AoE)
  • First Run: June 20
  • Bug/Conflict Report Deadline: June 25
  • Bugfix Deadline: June 28
  • Presentation of Results: July 4

Competition Categories and Awards

The competition contains several categories for termination and complexity from the areas of term rewriting (all categories with HO/FO-TRSs, with or without strategies) and programming languages (Logic Programming, Haskell, Java, C, integer transition systems, ...).

Proposals for new categories are welcome and will be discussed on the mailing list.


A category is only run at the competition if there are at least 2 participants and at least 40 examples for this category in the underlying termination problem database.
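
As a small illustration, this condition can be read as the following predicate; the function and parameter names are hypothetical, chosen only for this sketch.

  # Sketch of the rule above: a category is run only if it has at least
  # 2 participants and at least 40 examples in the termination problem database.
  def category_is_run(num_participants: int, num_examples: int) -> bool:
      return num_participants >= 2 and num_examples >= 40

  print(category_is_run(3, 120))  # True
  print(category_is_run(1, 500))  # False: fewer than 2 participants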


Competition Procedure

All participants in the same category will be run on the existing problems of this category. Modifications of the rules may be suggested by the organizer and decided by the SC.

The wall-clock timeout will be 300 seconds, and 4 cores will be available (if a tool wants to use concurrent execution). The tools will be started in their directory and obtain the file containing the problem as input.

The tools are expected to give an answer (YES, NO, MAYBE) on the first line of stdout, followed by a proof in ASCII, HTML, or CPF format. Exceptions to these rules are the certified categories (see also the CPF website) and the complexity categories. See all existing categories for more details.
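
As a rough illustration of this output convention, here is a minimal Python sketch of a tool's entry point. It assumes, beyond what this page states, that the path of the problem file is passed as the first command-line argument, and it always answers MAYBE; a real participant tool would perform an actual analysis and print a proof.

  #!/usr/bin/env python3
  # Minimal sketch of the expected output convention, not an actual solver.
  # Assumption (not stated on this page): the problem file path arrives as
  # the first command-line argument.
  import sys

  def main() -> None:
      if len(sys.argv) < 2:
          print("MAYBE")  # no problem given; give the safe, uninformative answer
          return
      with open(sys.argv[1]) as f:
          problem = f.read()  # a real tool would parse and analyze this
      answer = "MAYBE"        # one of YES, NO, MAYBE
      print(answer)           # first line of stdout: the answer
      # Remaining lines: a proof in ASCII, HTML, or CPF format.
      print("Proof: this placeholder does not attempt a proof.")

  if __name__ == "__main__":
      main()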

For those categories devoted to proving only termination or non-termination, the score of a tool is the number of non-contradictory answers minus the given penalization (if any).

A proof or answer that is known to be wrong will be penalized with -10 points if it remains after the bugfix deadline. The SC will decide what should be penalized, depending on the discussion among the participants.
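
To make the scoring rule concrete, here is a small Python sketch under one possible reading of the rule above: only YES/NO answers count towards the score, and each answer still judged wrong after the bugfix deadline costs 10 points instead of scoring. The interpretation and the data are illustrative, not taken from the official rules.

  # Sketch of the scoring rule above (illustrative interpretation): the score
  # is the number of YES/NO answers not judged wrong, minus 10 points for
  # every answer still known to be wrong after the bugfix deadline.
  WRONG_ANSWER_PENALTY = 10

  def score(answers: list[str], wrong: set[int]) -> int:
      """answers: per-problem answers ("YES", "NO", "MAYBE");
      wrong: indices of answers judged wrong after the bugfix deadline."""
      correct = sum(1 for i, a in enumerate(answers)
                    if a in ("YES", "NO") and i not in wrong)
      return correct - WRONG_ANSWER_PENALTY * len(wrong)

  # Example: three answers, the second of which turned out to be unsound.
  print(score(["YES", "NO", "MAYBE"], {1}))  # 1 - 10 = -9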

Steering Committee

  • Florian Frohn, MPI Saarbrücken
  • Jürgen Giesl, RWTH Aachen
  • Georg Moser, University of Innsbruck
  • Albert Rubio (Chair), UPC Barcelona
  • Akihisa Yamada (Organizer), AIST Tsukuba

Registration

To submit a tool, please follow the instructions in the TermCOMP web repository (https://github.com/TermCOMP/starexec-master/).

To submit a benchmark, please follow the instructions in the TPDB repository (https://github.com/TermCOMP/TPDB).

We recommend registering early. After the deadline, access to StarExec (http://www.starexec.org) might be restricted. We need time to prepare the competition, and other competitions may be running in parallel.

It is highly recommended that participants also subscribe to the termtools mailing list, because that is where announcements will be made, and where discussion takes place.

StarExec Information

This refers to StarExec in general, and not to Termination or Complexity in particular.

Technical Details

The competition will be run on StarExec, a cross-community solver (tool) execution and benchmark (problem) library service under joint development (since 2012) at the University of Iowa and the University of Miami.

Competition data will be presented via starexec-master (a successor of star-exec-presenter developed at HTWK Leipzig).

Technical details about the execution platform (as of 2014) can be found here.


Contact

Any questions or suggestions regarding the termination competition should be sent to, and will be discussed on, the termtools mailing list: termtools<at>lri.fr.

To contact the steering committee, send an email to terminationcompetitionsc<at>lists.rwth-aachen.de.