Artifact Evaluation

FSE 2025 aims to support open and reproducible research within the field of cryptography. As such, authors of papers accepted to FSE 2025 are invited to submit artifacts associated with their papers, such as software or datasets, for review, in a collaborative process between authors and the artifact review committee.

IACR FSE Artifact Badges

Authors can choose to have their artifacts evaluated by the FSE Artifact Evaluation Committee (AEC) against three badges: Artifacts Available, Artifacts Functional, and Results Reproduced. Each evaluation is optional. This system broadly follows the conventions established in recent years at security research conferences such as USENIX Security and NDSS.

IACR FSE Artifacts Available: To earn this badge, the AEC must judge that artifacts associated with the paper have been made available for retrieval. Other than making the artifacts available, this badge does not mandate any further requirements on functionality, correctness, or documentation. This is intended for authors who simply wish to make some supplementary material available that supports their paper. Examples include data sets, large appendices, and other documentation.
IACR FSE Artifacts Functional: To earn this badge, the AEC must judge that the artifacts conform to the expectations set by the paper in terms of functionality, usability, and relevance. The AEC will consider four aspects of the artifacts in particular.
- Documentation: are the artifacts sufficiently documented to enable them to be exercised by readers of the paper?
- Completeness: do the submitted artifacts include all of the key components described in the paper?
- Exercisability: do the submitted artifacts include the scripts and data needed to run the experiments described in the paper, and can the software be successfully executed?
- Reusability: are the artifacts not just functional, but of sufficient quality that they could be extended and reused by others?

IACR FSE Results Reproduced: To earn this badge, the AEC must judge that they can use the submitted artifacts to obtain the main results presented in the paper. In short, is it possible for the AEC to independently repeat the experiments and obtain results that support the main claims made by the paper? The goal of this effort is not to reproduce the results exactly but instead to generate results independently within an allowed tolerance such that the main claims of the paper are validated.

Scope and Aims

The goal of the process is not just to evaluate artifacts but also to improve them. Artifacts that pass successfully through the artifact review process will be archived alongside the paper on the website of the Transactions on Symmetric Cryptology (ToSC) and at https://artifacts.iacr.org/.

Examples of artifacts in the field of cryptography include:

Software implementations (performance, formal verification, etc.): The source code of the implementation; a list of all dependencies required; the test harness; instructions on how to build and run the software and the test harness; a description of the platform on which the results in the paper were obtained; and instructions or scripts to process the output of the test harness into appropriate summary statistics.
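As a purely illustrative sketch of the last item, a post-processing script for a performance-oriented artifact might look like the following; the file name, the one-measurement-per-line format, and the cycle-count units are assumptions for the example, not requirements of the process.

```python
# Hypothetical post-processing script: reads raw timing measurements
# (one cycle count per line; "timings.txt" is an assumed file name) and
# prints the summary statistics that the paper might report.
import statistics
import sys


def summarize(path: str) -> None:
    with open(path) as f:
        samples = [float(line) for line in f if line.strip()]
    if not samples:
        sys.exit(f"no measurements found in {path}")
    print(f"samples:       {len(samples)}")
    print(f"median cycles: {statistics.median(samples):.1f}")
    print(f"mean cycles:   {statistics.mean(samples):.1f}")
    if len(samples) > 1:
        print(f"stdev cycles:  {statistics.stdev(samples):.1f}")


if __name__ == "__main__":
    summarize(sys.argv[1] if len(sys.argv) > 1 else "timings.txt")
```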

Data or other non-code artifacts: Documents or reports in a widely used non-proprietary format, such as PDF, ODF, HTML, text; data in machine-readable format such as CSV, JSON, XML, with appropriate metadata describing the schema; scripts used to process the data into summary form. Where non-standard data formats cannot be avoided, authors should include suitable viewing software.
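For data artifacts, a small script that checks a machine-readable file against the documented schema and reports a simple summary can serve as the "summary form" processing step. The sketch below is illustrative only; the file name and column names are hypothetical.

```python
# Illustrative example: validate a CSV data file against the schema
# documented in the artifact's metadata and print a simple summary.
import csv

EXPECTED_COLUMNS = ["cipher", "key_size", "rounds", "time_ms"]  # assumed schema


def check_and_summarize(path: str = "results.csv") -> None:
    with open(path, newline="") as f:
        reader = csv.DictReader(f)
        if reader.fieldnames != EXPECTED_COLUMNS:
            raise ValueError(f"unexpected columns: {reader.fieldnames}")
        rows = list(reader)
    if not rows:
        raise ValueError(f"no data rows found in {path}")
    times = [float(r["time_ms"]) for r in rows]
    print(f"rows:            {len(rows)}")
    print(f"average time_ms: {sum(times) / len(times):.2f}")


if __name__ == "__main__":
    check_and_summarize()
```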

Where possible, such as for software-based artifacts relying solely on open-source components, the artifact review process will aim to run the artifact and test harness and verify that they produce the outputs required to assess the artifact against the results in the paper. For artifacts that depend on commercial tools, the goal of the artifact review process will be to confirm that the artifacts are functional (should the submitters wish to be evaluated for functionality) and could plausibly be used by someone with access to the appropriate tools to reproduce the results.


Timeline and Process

The artifact review process begins after the paper has been accepted for publication in ToSC. Only papers accepted to FSE 2025 will be considered under the artifact review process.

Following notification of acceptance (or acceptance with minor revisions) to FSE 2025, the artifact may be submitted for review at any time up to the next artifact submission deadline.

Artifact Submission Deadlines

For papers accepted to ToSC 2024-2: TBD
For papers accepted to ToSC 2024-3: 11 September 2024 (Wednesday)
For papers accepted to ToSC 2024-4: 12 December 2024 (Thursday)
For papers accepted to ToSC 2025-1: 6 March 2025 (Thursday)

Once the artifact is submitted, two members of the artifact review committee will be assigned to review it. The review will be a continuous, interactive process and may involve requests from the reviewers for additional help on how to run the artifact, interpret its results, and so on. It is acceptable (and expected) that the interaction between the reviewers and the authors leads to the artifact being updated during the review process. Updates that affect scientific characteristics reported in the paper (such as changes to performance) should be clearly documented.

We aim for the artifact review process to be completed within 8 weeks of the artifact being submitted, but this will vary depending on the scale of the artifact and the timeliness of the interaction between authors and reviewers. Authors of artifacts that are accepted for archiving will be provided with instructions on how to submit the archival version of their artifacts.

We ask authors to be understanding and to join us in viewing this as a collaborative process aimed at producing better artifacts for the scientific community.

Confidentiality

The artifact review process will be single-blind: the authors of the paper and artifact are not anonymous, but the reviewers will be anonymous. Communication between the authors and the reviewers will be facilitated via the HotCRP review site. Authors should not attempt to learn the identities of the reviewers, for example by embedding analytics or tracking elements in the artifact or an associated website. For this reason, hosting the artifact on a personal website should be avoided, as it makes it easier to log and check who accessed the files. If you cannot comply with this for reasons outside your control, please notify the chairs immediately to discuss.

Conflict of Interest

The FSE 2025 artifact review process follows the same conflict of interest policy as ToSC, which is the IACR policy with respect to conflicts of interest. A conflict of interest is considered to occur automatically whenever an author of a submitted paper and a reviewer

were advisee/advisor at any time,
have been affiliated with the same institution in the past 2 years,
have published 2 or more jointly authored papers in the past 3 years, or
are immediate family members.

Conflicts may also arise for reasons other than those just listed. Examples include closely related technical work, cooperation in the form of joint projects or grant applications, business relationships, close personal friendships, and instances of personal enmity. For more information, please see the IACR Policy on Conflicts of Interest. Authors will be asked to identify conflicts of interest with the committee members at the time of artifact registration.

Copyright and Licensing Conditions

In order for the IACR to distribute artifacts, we require permission to do so. You are asked to grant this permission under an open-source license of your choice, such as an OSI-approved license. As some artifacts may combine portions created by you with third-party materials obtained elsewhere, you must ensure that you have obtained a license to redistribute all third-party materials included in the artifact that were not created by you, for example by including only open-source components or by otherwise obtaining and demonstrating the required permission.

It is not a requirement that any patent rights be granted.

Submission Instructions and Format

Artifacts shall be registered and submitted via the IACR HotCRP artifact server: (TO BE ANNOUNCED)

A submission shall include:

- The title and abstract of the accepted paper
- The authors of the accepted paper and their affiliations
- Email addresses of the contact authors for the artifact
- The PDF of the submitted paper, or an updated/camera-ready version, if available
- A brief description of the artifact
- If the artifact is smaller than 20 MB: a .zip or .tar.gz archive containing the artifact
- If the artifact is 20 MB or larger: instructions on how to obtain the artifact
- A link to a GitHub repository or similar for the artifact, if available, along with the commit/tag of the submission

The artifact itself shall include at least the following files:

LICENSE: The license(s) under which the artifact is released
README: The main starting point for anyone attempting to use the artifact. It should include:
- The dependencies required to build and run the artifact, including specific version numbers
- Instructions for building and running the artifact
- Options for configuring the artifact to run in different modes, if applicable
- Instructions on how to interpret the output of the artifact, including which scripts to run if appropriate
- An explanation of how the source code is organized

Files such as LICENSE and README can be plain text files or Markdown files.

Source code files within the artifact are encouraged to be organized, formatted, and documented using best practices and conventions appropriate to the programming language in question: for example, formatting using a consistent style such as PEP 8 for Python, API documentation using Javadoc for Java or Doxygen for C, and unit tests using an appropriate framework.
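As a small illustration of this spirit for Python (the function and its test are hypothetical, not taken from any particular artifact), a source file might look like:

```python
"""Toy helper module showing PEP 8 formatting, docstrings, and a unit test."""


def rotate_left(value: int, amount: int, width: int = 32) -> int:
    """Rotate ``value`` left by ``amount`` bits within a ``width``-bit word.

    Args:
        value: The word to rotate, interpreted as an unsigned integer.
        amount: Number of bit positions to rotate by.
        width: Word size in bits (defaults to 32).

    Returns:
        The rotated word, reduced modulo 2**width.
    """
    mask = (1 << width) - 1
    amount %= width
    return ((value << amount) | (value >> (width - amount))) & mask


def test_rotate_left() -> None:
    """Unit test in the style expected by pytest."""
    assert rotate_left(0x80000001, 1) == 0x00000003
```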


Packaging of the Artifact

The primary form of the artifact should be as source code, with suitable build scripts and instructions on how to install the appropriate dependencies.

For artifacts with complex dependencies or build requirements, the authors are encouraged to also package the artifact in the manner that makes it most amenable to successful execution. Potential formats include:

A virtual machine or container image (VirtualBox, Docker, …) containing the artifact and all dependencies already installed, with the artifact compiled, configured, and ready to run. It is preferable to also include the Dockerfile or script used to create the image, if possible.
A binary installable package, such as .rpm or .deb package on Linux, or an MSI Installer on Windows.
A video demonstrating the use of the artifact and the results, especially in the case of an artifact that requires commercial software, specialized hardware, or long computation times.
A "live notebook" (Jupyter, Sage,...) for demonstrating a sequence of mathematical calculations, especially of data artifacts.

When in doubt, imagine a first-year grad student in 2029 who is told by their supervisor "See if you can change this artifact from FSE 2025 to do X." We want to give them the best chance of success with the least amount of pain.

FSE 2025 Artifact Evaluation Committee (AEC)

Elena Andreeva (TU Wien)

Sébastien Duval (University of Lorraine)

David Gerault (Technology Innovation Institute)

Hosein Hadipour (TU Graz)

Danping Shi (Chinese Academy of Sciences)

Ling Sun (Shandong University)


Artifact Review Chair

Patrick Derbez (University of Rennes)