Termination Competition 2022
Latest revision as of 14:09, 7 November 2022
The Termination and Complexity Competition (termCOMP) 2022 is affiliated with IJCAR 2022 and takes part in the FLoC Olympic Games 2022.
Five medals will be awarded:
- Gold, Silver, and Bronze medals go to the top three teams according to a competition-wide ranking.
- Two special medals go to the best two teams in advancing the state of the art.
The competition will be run on the StarExec platform. The first run will take place shortly before FLoC, followed by a bug/conflict reporting phase and a window for bug fixes. The final run and a presentation of the final results will take place live at FLoC.
All (co-)authors of tools or benchmarks are invited (but not required) to give a 5-20 minute presentation of their contributions at the Workshop of Termination (WST 2022) on 12 August 2022. We ask for a title and abstract by 18 June via this submission site. Anyone who sends this in may additionally submit a 1-2 page paper by July 11, to be included in the WST proceedings (but this is not mandatory).
Status

Final Result: https://termcomp.github.io/Y2022/
Dates
- June 18: (Optional) Title and Abstract Submission to WST
- July 24: Tools and Problems Submission Deadline
- July 25: First Run
- July 30: Conflict/Bug Report Deadline
- Aug. 4: Bugfix Deadline
- Aug. 7: Final Run
- Aug. 9: Award Ceremony at FLoC
- Aug. 12: Presentations at WST
Awards and Competition Categories
Five FLoC Olympic Games medals will be awarded to participants.
- Gold, Silver, and Bronze medals go to the top three teams according to a competition-wide ranking. Teams are ranked by the Euclidean norm of their normalized score vectors. Each component of the vector is the team's score in a category, divided by the score of the virtual best solver (VBS) in that category. The VBS records the best (consistent) score for each claim, collected since at least 2018.
- Two special medals go to the two teams that best advance the state of the art. If a team gets a higher score for a benchmark and claim than the previous year's VBS, the team receives the difference as a special score, and teams are ranked by the sums of these scores. In short, if you claim YES/NO where no tool has claimed so in the past, you get a special score of 1.
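The competition-wide ranking above can be sketched as follows. All category names and scores here are hypothetical, purely to illustrate the Euclidean norm over VBS-normalized scores; this is not the organizers' actual scoring code.

```python
import math

# Hypothetical categories and scores, for illustration only.
# vbs[c] is the virtual best solver's score in category c.
vbs = {"TRS": 900, "SRS": 400, "C": 300}

# team_scores[team][c] is a team's score in category c.
team_scores = {
    "ToolA": {"TRS": 850, "SRS": 350, "C": 0},
    "ToolB": {"TRS": 600, "SRS": 390, "C": 290},
}

def rank_key(scores):
    """Euclidean norm of the score vector, each component
    normalized by the VBS score of its category."""
    return math.sqrt(sum((scores.get(c, 0) / vbs[c]) ** 2 for c in vbs))

# Higher norm = better rank; a team strong across all categories
# (ToolB here) beats one that skips a category entirely (ToolA).
ranking = sorted(team_scores, key=lambda t: rank_key(team_scores[t]),
                 reverse=True)
print(ranking)
```

Note how the normalization makes categories of very different sizes comparable: each component is at most 1 (matching the VBS), so skipping a category costs as much as it would in any other category.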
The competition contains several categories for termination and complexity from the areas of term rewriting (all categories with HO/FO-TRSs, with or without strategies) and programming languages (Logic Programming, Haskell, Java, C, integer transition systems, ...).
Proposals for new categories are welcome and will be discussed on the mailing list.
A category is only run at the competition if it has at least 2 participants and at least 40 examples in the underlying termination problem database (TPDB).
Competition Procedure
All participants in the same category will be run on the existing problems of that category. Modifications of the rules may be suggested by the organizer and decided by the SC.
The wall-clock timeout will be 300 seconds, and 4 cores will be available (if a tool wants to use concurrent execution). The tools will be started in their directory and obtain
- the problem file name on the command line,
- and extra info from environment variables, cf. Termination Competition 2014 technical details
The tools are expected to give an answer (YES, NO, MAYBE) in the first line on stdout, followed by a proof in ASCII, HTML, or CPF format. Exceptions to these rules are the certified (see also the CPF-website) and complexity categories. See all existing categories for more details.
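A minimal sketch of a tool respecting this input/output convention: the problem file name arrives as the first command-line argument, the answer goes on the first line of stdout, and a proof follows. This dummy tool always answers MAYBE and emits a placeholder proof in ASCII; a real participant would of course implement an actual analysis, and would read strategy or category information from the environment variables mentioned above.

```python
#!/usr/bin/env python3
# Dummy "termination tool" illustrating the competition I/O convention
# (illustration only; always answers MAYBE).
import sys

def main() -> None:
    problem_file = sys.argv[1]    # problem file name on the command line
    with open(problem_file) as f:
        problem = f.read()        # a real tool would parse and analyze this
    print("MAYBE")                # first line of stdout: YES, NO, or MAYBE
    # Followed by a proof in ASCII, HTML, or CPF; here a placeholder.
    print(f"No proof attempted for {problem_file} ({len(problem)} bytes).")

if __name__ == "__main__":
    main()
```

Since the wall-clock timeout is 300 seconds with 4 cores available, a tool may also fork concurrent provers internally, as long as the first line printed to stdout is the answer.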
For categories devoted to proving only termination or non-termination, the score of a tool is the number of non-contradictory answers, minus any penalties.
A proof or answer that is known to be wrong is penalized with -10 points if it remains after the bugfix deadline. The SC decides what is penalized, based on the discussion among the participants.
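As a sketch of this scoring rule (the result list is hypothetical, and the assumption that only YES/NO answers score while MAYBE earns nothing is mine, based on the description above):

```python
# Hypothetical results of one tool in a termination category:
# (answer, known_wrong), where known_wrong means the answer/proof
# was shown incorrect and survived the bugfix deadline.
results = [
    ("YES", False),
    ("NO", False),
    ("MAYBE", False),  # assumption: MAYBE earns no points
    ("YES", True),     # known wrong: penalized
]

WRONG_PENALTY = -10

def score(results):
    """Non-contradictory YES/NO answers, plus -10 per answer
    known to be wrong after the bugfix deadline."""
    answered = sum(1 for ans, wrong in results
                   if ans in ("YES", "NO") and not wrong)
    penalties = sum(WRONG_PENALTY for _, wrong in results if wrong)
    return answered + penalties
```

With the list above, the tool scores 2 points for its two good answers but pays 10 for the wrong one, ending at -8; a single surviving wrong answer can thus outweigh many correct ones.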
Steering Committee
- Florian Frohn, RWTH Aachen
- Jürgen Giesl, RWTH Aachen
- Georg Moser, University of Innsbruck
- Albert Rubio, Complutense University of Madrid
- Akihisa Yamada (Chair and Organizer), AIST Tokyo Waterfront
Registration
To submit a tool, please follow the instructions at the TermCOMP web repository.
To submit a benchmark, please follow the instructions at the TPDB repository.
We recommend registering early and updating information as needed. After the deadline, access to StarExec might be restricted: we need time to prepare the competition, and other competitions may be running in parallel.
It is highly recommended that participants also subscribe to the termtools mailing list, because that is where announcements will be made, and where discussion takes place.
StarExec Information
This refers to StarExec in general, and not to Termination or Complexity in particular.
- user guide
- issue tracker
- announcements and discussion (combined feed for recent messages)
- (obsolete) announcements and discussion (discontinued, but contains some information that is still valid and not available elsewhere)
Technical Details
The competition will be running on StarExec - a cross-community solver (tool) execution and benchmark (problem) library service under joint development (since 2012) at the University of Iowa and the University of Miami.
Competition data will be presented via starexec-master (a successor of star-exec-presenter developed at HTWK Leipzig).
Technical details about the execution platform (as of 2014) can be found here.
Contact
Any questions or suggestions regarding the termination competition should go to, and be discussed at, termtools<at>lists.rwth-aachen.de.
To contact the steering committee, send an email to terminationcompetitionsc<at>lists.rwth-aachen.de.
Changes with respect to 2021
These lists reflect the discussion in the community and in the steering committee. Items are preliminary and not officially binding. Please do not edit this list (unless you are in the SC). Instead, send proposals to the termtools or terminationcompetitionsc mailing list, or create a new wiki page and put a link here.
Proposed changes:
Adopted changes:
- Medals will be awarded.
- A competition-wide ranking scheme based on Euclidean norm.
- Awards for the two best solvers in advancing the state of the art (closing unsolved termination problems or improving known complexity bounds).
- Tool/benchmark authors are invited to submit a paper and present at WST.