Documentation

Introduction

In this document you can find a description of most of CodeJudge's features (as we continuously develop the system, it may have undocumented features that are not yet "released"). The document is intended for people who set up exercises (professors, teachers, TAs, etc.), not for students.

Guides

Besides this reference document, we have a number of guides available. These are most likely the best place for newcomers to start.

Admin Interface

The red menu items are the admin features, while the white items are for the users. The users do not have access to the admin features, but you will have access to everything on the course, including content on the user pages that is hidden from the users (e.g. test data, suggested solutions, the 'regrade' button, etc.). Content that only admins can see on the user pages is often colored red. You can see exactly what is visible to the users by clicking the "see as user" link next to your login name.

Exercises

On the Exercises page normal exercises and assignments can be set up and managed. See Exercises for further details.

Users

On the Users page all users on the course, including administrators, are listed. Each user can have a tag that can be used to distinguish different kinds of users. Specific information about a user, including solved exercises, attempts and all submissions, can be seen by clicking on the user in the list.

Submissions

On the Submissions page a complete list of all submissions on the course is shown. The submissions can be filtered between dates, or a specific submission can be found by its id. The latter is useful when a user has a question about a specific submission, since it can simply be retrieved using its id (which is also visible to the user).

Settings

On the Settings page, different configurations for the course can be set. See Settings for further details.

Exercises

On the Exercises page you can create and edit normal exercises, grouped in exercise groups, and assignments on CodeJudge. The normal exercises are meant to be non-mandatory training exercises: the users can submit solutions at any time, and the best submission determines the exercise status (solved, attempted or un-attempted). Assignment exercises, on the other hand, are meant to be handed in; see assignments for further details.

Exercise Group

An exercise group is a set of normal exercises. When creating/editing an exercise group the following properties can be set:

Name The name shown in the list of exercise groups (like 'Week X' etc).
Visible From The date and time from which the exercise group will be visible for the users.

Exercise

An exercise has the following properties:

Name The name shown in the list of exercises in the exercise group.
Type The type of the exercise, see exercise types.
Description A description of the exercise visible to the users. The description can be styled using html.
Languages Select the programming languages which the exercise may be solved in. Selecting "Default Languages" will allow the default language(s) of the course, see settings.
Attached Files Files such as templates or header files can be attached to an exercise and will then be available to the users. Make sure not to upload test data here.

Solution

The solution to the exercise can optionally be uploaded and can serve two purposes: it can be used to generate the output of the test data automatically (see expected output), and/or it can be shown to the users as a suggested solution. The following properties are thus available for the solution:

Solution Files Upload the solution file(s) to the exercise (as users would have to).
Show Suggested Solution Select if/when a suggested solution should be shown to the users:
  • After exercise is solved - the suggested solution will only be visible to the user after he/she has solved the exercise.
  • Always - the suggested solution will always be visible to the users.
  • Never - the suggested solution will never be visible to the users.
Suggested Solution Type If the suggested solution is visible at some point (not set to "Never"), then the type of the suggested solution can be selected:
  • Use solution files - the uploaded solution files will be shown as the suggested solution.
  • Custom - a custom suggested solution can be written, see next option.
Suggested Solution Text The suggested solution can be written in the text area, styled using html.

Test Groups

Finally, the test data should be uploaded. The tests are uploaded in test groups, which are sets of tests with the same configuration (see configuration). An exercise can have as many test groups as you want. Click on a test group in the list to edit it, or click Create New Testgroup to create a new test group. The following properties can then be set:

Name The name of the test group, visible to the user (e.g. Tests, Sample tests, etc).
Run If Select when the test group should be tested:
  • Always - the test group will always be run when a new submission is made.
  • [TG] succeeds - the test group will only be tested if the test group [TG] succeeds, i.e. all tests are passed.
  • Never - the test group will not be tested automatically. You can manually run it on submissions afterwards.
Visibility Select what should be visible to the users after submitting a solution:
  • All - All test data and judge output will be visible.
  • No test files - Only the judge output will be visible.
  • No test files or judge - Only the result (Succeeded, Wrong Answer, etc.) of each test is visible.
  • Not visible - The entire test group will not be visible.
(Overwrite) Test Data All the test data for the test group should be uploaded in a single file or as a zip file. The new test data will overwrite any existing test data. For details on the test data files, see test data.

Good Practice

For training exercises we suggest having only one test group with full feedback (visibility set to All) and the execution mode set to 'Until failure' (see configuration). This way the users can focus on correcting one test at a time. We also encourage you to upload a suggested solution and make it visible after the exercise is solved, because we believe the best way to learn is to try yourself first and then see the correct solution for comparison afterwards.

Exercise Types

Code Exercise

Code exercises are the most generic kind of exercise supported. The table in Language Support lists the programming languages supported for this type and which types of testing are supported for each of them. For further details on the types of tests, see test data. Please contact us if you need another language or an unsupported feature.

Import/Export

Exercises and exercise groups can be imported to CodeJudge by uploading a single zip file. To import a single exercise, the uploaded zip file should contain a folder with the exercise name and the following files in it:

NOTE: This function is very strict about the contents of the uploaded file, and no error messages to help diagnose problems are currently available. This will change soon.

description.htm The description of the exercise.
solution.htm The suggested solution of the exercise.
exercise.xml An XML file containing the other exercise properties (see details)
Folder(s): [Test group name] Folder(s) containing the test data. The name of the folder(s) should be the name of the test group(s). For details on how to set up the actual test data, see test data.

To import an entire exercise group, simply place all the exercise folders in a zip file and upload it. Exercises and exercise groups can also be exported in the same format as described above.
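
For instance, a zip file containing a single exercise might be laid out as follows (the exercise name "MyExercise" and the test group name "Tests" are just examples):

MyExercise/
    description.htm
    solution.htm
    exercise.xml
    Tests/              (a test group folder; its name becomes the test group name)
        Tests.in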

Test Data

CodeJudge supports various forms of testing which suit different types of exercises. A test group consists of one or more tests. A test may consist of a number of parameters: standard input (in), command line arguments (args), expected output (out), a hint (hint), a score (score), and a test script (described in the following). To create a test group, one must make the appropriate test files, for instance files specifying the input and the expected output.

Suppose we want to create three tests for a program adding two numbers from standard input. The tests could be "5 7" with expected output "12", "1 2" with expected output "3" and "-9 9" with expected output "0". There are two ways to structure the test files: either in a single file or in many files (or a combination of both; the first is most practical when created manually and the second when generated with scripts). For the first approach the above test cases can be created by a single file "Tests.in" containing:

/// Test
5 7
/// Out: 12
/// Test
1 2
/// Out: 3
/// Test
-9 9
/// Out: 0

The general pattern for this approach is a file named "Tests.[parameter]", where [parameter] can be any of the types described below. In this file, each test case should be started by a line with "/// Test". Tests may also be explicitly terminated by "/// EOT" (End Of Test). All lines not within a test are considered common to all tests. By default, tests created this way are named Test01, Test02, .... If you like, you can specify the name of a test by "/// Name: [name]". Finally, in any "Tests.[parameter1]" file one can write "/// [parameter2]: [value]" to set the contents of [parameter2] to [value] for a test (this is especially practical for hints and scores), as shown in the example below. (Note that specifying expected output is normally not necessary; see Expected Output for further details.)
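
For instance, names, hints and scores could be attached to the tests from before in a single "Tests.in" file like this (the names, hints and scores here are made up for illustration; the expected output is left out so it can be generated from the solution):

/// Test
/// Name: Small
/// Hint: Start with two small positive numbers.
/// Score: 1
5 7
/// Test
/// Name: Negative
/// Hint: Did you consider negative numbers?
/// Score: 2
-9 9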

Using the other approach, the test data could be saved in 6 different files, called Test01.in, Test01.out, Test02.in, Test02.out, Test03.in and Test03.out respectively. The general pattern here is simple; each parameter must be specified in a single file per test case with the name "[test].[parameter]".

A number of examples of how tests can be made can be found in "Quick Start: How to create test data for my exercise?".

Configuration

Besides the parameters for each test, a test group can have some common configuration settings. A test group can be configured either directly on the CodeJudge site when uploading or by uploading a file called "config.xml" together with the rest of the test files. In the following we will describe the different parameters that can be configured:

Execution Mode Select how the tests in the group are run (default is "Until failure"):
  • All - all tests are run regardless of the results of the others.
  • Until failure - when a test fails, the remaining tests are not run (recommended for non-mandatory exercises).
  • Performance - like "Until failure", except that if a test fails due to the time limit, it is not marked as a failure but is instead omitted.
Judge Type See the judges section. (default is "token based")
Auto Decimal Match Specifies whether tokens that can be parsed as decimals should be matched as decimals instead of literally. For instance, if this is set to true, "1" will match "1.0". (default: false)
Double Precision Specifies, as a decimal, how close decimal numbers must be to the expected output. (default: must be exactly equal)
Ignore Case Specifies whether comparisons should ignore case. (default: false)
Extra Delimiters Specifies extra characters that should be used as delimiters besides whitespace. (default: none)
Show Plot If set to true, a plot of input size vs. running time will be available under the test group on CodeJudge. (default: false)
Time Limit If a program runs longer than the time limit allows, it will be terminated. Should in general be kept as low as possible. Specified in milliseconds. (default: 1000)
Memory Limit Limits the amount of memory a program can use. For some languages this is not very precise. (default: no limit, meaning the hardware sets the limit)
Stack Limit Limits the amount of stack memory a program can use. For some languages this is not very precise. (default: no limit, meaning the default Linux settings apply)

A schema for the XML file can be found here.
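
As a rough illustration, a config.xml could look something like the following. The element names below are assumptions for illustration only; consult the XML schema linked above for the authoritative format:

<config>
  <!-- NOTE: element names here are illustrative guesses, not the real schema -->
  <executionMode>UntilFailure</executionMode>
  <judgeType>TokenBased</judgeType>
  <timeLimit>1000</timeLimit>
</config>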

Expected Output (out)

When a program is tested, its output is compared against the expected output, which is thus a vital part of a test. How the output is compared to the expected output is determined by the "judge" being used (see the section "Judges" for more information). The expected output can either be specified in a file in the test data, or, more practically, it can be generated automatically by CodeJudge. If you want CodeJudge to generate it for you, simply upload a solution before uploading your test data.

Command Line Arguments (args)

Command Line Arguments, or just arguments, can be specified for a test. See the documentation for your language to see how to access command line arguments, e.g. as sketched below for Java.
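
As a minimal sketch in Java (the exercise of summing the arguments is made up for illustration), the arguments arrive in the args array of main:

public class Sum {
    public static void main(String[] args) {
        int sum = 0;
        // Each command line argument arrives as a separate string
        for (String arg : args) {
            sum += Integer.parseInt(arg);
        }
        System.out.println(sum);
    }
}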

Standard Input (in)

Standard Input, also known as console input, is the most commonly used option besides test scripts.
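
For example, the add-two-numbers exercise from the beginning of this section could be solved in Java by reading the numbers from standard input (a minimal sketch):

import java.util.Scanner;

public class AddTwoNumbers {
    public static void main(String[] args) {
        Scanner in = new Scanner(System.in);
        // Reads e.g. "5 7" from standard input and prints "12"
        int a = in.nextInt();
        int b = in.nextInt();
        System.out.println(a + b);
    }
}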

Files in working directory (TestXX/wkdir/*)

All files and directories placed in the path TestXX/wkdir/ will be copied to the working directory of the user's program when it is executed (to use this function, the test data must be zipped before uploading). This is useful if you want the users to learn about file access.
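
For instance, if a test ships a file wkdir/data.txt (the file name is made up for illustration), a Java submission could read it with a relative path, since the wkdir contents are copied to the program's working directory:

import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Paths;

public class PrintFile {
    public static void main(String[] args) throws IOException {
        // data.txt was copied from the test's wkdir/ folder
        for (String line : Files.readAllLines(Paths.get("data.txt"))) {
            System.out.println(line);
        }
    }
}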

Test Scripts

A test script is a program written in the same language as the submission, which will be executed in combination with the submitted files. How this is done depends on the language. A test script is in many ways equivalent to a unit test. For instance, one could make an exercise where the users must implement a function average(a, b) that returns the average of a and b. In order to test it, you can upload a number of test scripts calling average(a, b) with different arguments. The easiest way to learn how to make test scripts is to look at our samples in "Quick Start: How to create test data for my exercise?".

Java (java)

In Java the test script must be a fully functional Java program, except that it may call methods the users are supposed to provide. I.e., it must consist of a public class with a normal public static void main(String[] args) method. For instance:

public class Test01 {
    public static void main(String[] args) {
        System.out.println(Calculator.average(4, 9));
    }
}

If you have not uploaded a solution file before uploading your test data, you are also required to add Java files in the directory dummy/ containing the classes and methods the users are supposed to implement. It is highly recommended not to use this approach, but simply to upload your solution instead. These classes/functions do not need to be functional, but they must be compilable. In other words, if you combine the test script file and these extra files, they must compile. For the above example, we should add the file dummy/Calculator.java:

public class Calculator {
    public static double average(double a, double b) {
        return 0; // Dummy, since it must only be compilable
    }
}

Please note that all of the provided classes will be stripped of package declarations. In this way, we can support users putting the classes in their own packages.

Remaining Languages

See examples.

Hints (hint)

You can add a hint to a test case, which will be available to the user if the user fails the test (currently hints are shown no matter how a test fails). Hints should be short, for example "Did you consider negative numbers?" or similar.

Score (score)

A test can have an associated score. This is useful for competitions and grading purposes. The score must be a single number. Higher scores are considered better. The score of a submission is the sum of all the scores of the test cases it passes.

Size (size)

A test may be given a "size". This can be used for plotting running time vs. size, which is useful for analyzing the asymptotic running time of a solution.

Language Support

Below you see a table of all languages currently supported on CodeJudge.

Language                    Arguments   Input   Files   Test Scripts
Java 8                      YES         YES     YES     YES
C (gcc 7.2.0)               YES         YES     YES     NO
C++ (C++14, g++ 7.2.0)      YES         YES     YES     YES
C++11 (g++ 7.2.0)           YES         YES     YES     YES
C++1z (g++ 7.2.0)           YES         YES     YES     YES
C# (Mono 4.6, .NET 4.5)     YES         YES     YES     NO
F# (4.0)                    YES         YES     YES     YES
Python3 (3.6)               YES         YES     YES     YES
Python2 (2.7)               YES         YES     YES     YES
Matlab (2017b)*             NO          YES     YES     YES
R (3.4.2)                   YES         YES     YES     YES
Bash (4.4 GNU)              YES         YES     YES     NO
Prolog (SWI-Prolog 7.4)     YES         YES     YES     YES
Rust (1.22)                 YES         YES     YES     YES
Pascal (fpc 3.0)            YES         YES     YES     NO
Coq (8.5pl3) [beta]         YES         YES     YES     NO
Elixir (1.5) [beta]         YES         YES     YES     NO
Haskell (8.0.2) [beta]      YES         YES     YES     NO
Go (1.9) [beta]             YES         YES     YES     NO

* Please note Matlab support is only available if you have a special agreement with us.

Judges

A judge is the program on the server that evaluates the output of the users' programs. For each test case, the output of a user's program is (usually) compared to the expected output file, and if they match (the matching criteria depend on the judge), the test is passed.
CodeJudge currently supports the following three judges: Exact Judge, Token Based Judge and Custom Judge.

Exact Judge

The Exact judge checks whether the output of the user's program exactly matches the output file, including all whitespace, newlines, etc. The only exceptions are that \r characters are ignored, as are line breaks at the very end of the file.

This judge is especially useful for exercises where strings including whitespace should be printed (so they must match 100%), and it can also be used for test scripts, since there the user is not the one printing the output.

Token Based Judge

The Token Based judge also compares the output of the user's program with the expected output, but any sequence of whitespace, including newlines, is treated as a single space, so how the user chooses to separate the output does not matter.

This judge is useful for input/output exercises where the users have to print the output themselves and the whitespace does not matter. This judge also has a number of configuration options (for decimal comparisons, case-sensitivity, etc.); see configuration.

Custom Judge

If you have a more advanced exercise where one of the above judges cannot evaluate the programs properly you can write your own custom judge. The judge can be written in any of our supported languages.

Uploading

To upload a custom judge, you must add a folder named judge/ when you upload the test files and place the source code file of the judge in this folder. Remember to specify the properties of the custom judge in config.xml; for further details see configuration.

Accessing data

For each test case, two files will be placed in the working directory of the judge: expected (the expected output) and output (the user's output), both without file extensions, which can be read by the judge. If the test has an input file, the judge can read its contents from standard input.

Evaluating the program

After evaluating the program, the judge can print the following lines to standard output to report the results (only the first is mandatory):

RESULT [result] [result] must be CORRECT if the test case is accepted, otherwise WRONG. (mandatory)
TEXT [text] An optional message to the users about this single test run. For instance, if the test was passed it could be "Correct" as for the other judges, or if the test failed it could be an error message like "Saw a but expected b".
SCORE [score] A score indicating how well the submission passed the test case. Can be used for optimization problems.
FILE [file] Specifies that [file] should be copied from the working directory, and will be shown together with the results of the test in CodeJudge.

If the judge program terminates with an error, the system will mark it as a system error.

Templates/Examples

A few templates and examples of custom judges will be available here later.
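
In the meantime, here is a minimal sketch of what a custom judge could look like, written in Java (the case-insensitive comparison is just an example of custom matching logic; a real judge would implement whatever criteria the exercise needs):

import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Paths;

public class Judge {
    public static void main(String[] args) throws IOException {
        // "expected" and "output" (no file extensions) are placed in
        // the judge's working directory for each test case
        String expected = new String(Files.readAllBytes(Paths.get("expected"))).trim();
        String output = new String(Files.readAllBytes(Paths.get("output"))).trim();

        if (expected.equalsIgnoreCase(output)) {
            System.out.println("RESULT CORRECT");
            System.out.println("TEXT Correct");
        } else {
            System.out.println("RESULT WRONG");
            System.out.println("TEXT Saw " + output + " but expected " + expected);
        }
    }
}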

Execution Environment

All code submitted to our system is evaluated on a 64-bit Linux system in a secured environment. Among other things, this prevents programs from accessing the internet during execution. Similarly, file access is restricted such that only a limited subset of files on the system can be edited.

Assignments

An assignment consists of a set of exercises like an exercise group, but since the solutions to these exercises are to be handed in, the exercises have a due date. The last submission to each exercise before the due date counts as the solution handed in. If a user by mistake submits a wrong solution last, the user's other submissions can still be seen in the list of all submissions.

Creating an Assignment

An assignment has the following properties:

Name The name of the assignment.
Visible From The date and time from which the assignment will be visible for the users.
Due Date The due date for the assignment.
Post Submissions Selects whether or not the users can submit solutions after the due date. These submissions will not count as handed in, but can be used after feedback has been given to try and correct possible mistakes.
Groups Selects whether or not group hand-in is allowed.
Description A description of the overall assignment. Each exercise still has its own description.

The exercises of an assignment are created exactly as training exercises, see creating an exercise. Note that a suggested solution to assignment exercises will never be visible to the users, only to you.

Good Practice

For assignments we recommend always having a test group called 'Sample Tests' (or similar) consisting of a few small test cases with full feedback, and then another test group with all the hard tests, which is either invisible or gives no feedback, depending on what suits the course best. The sample tests will help the users correct simple mistakes in their handed-in solutions, while still not revealing whether the solutions are completely correct.

Overview of Hand-ins

From the exercise page a hand-in overview page can be opened. On this page all users/groups are listed, and for each user/group it can be seen which exercises they have solved or attempted. Clicking on a user/group opens a hand-in page, where the handed-in submissions can be opened. This page also shows a possible comment from the users, and feedback can be written to the user/group.

Plagiarism

After the due date has passed, a simple plagiarism check is run on all handed-in solutions. The plagiarism check compares the solutions two by two and finds identical sequences (all variable names are considered identical and whitespace is ignored). All solutions are then ranked by the amount of code that is identical to another solution. An overview of the plagiarism scores of all solutions to an exercise can be opened from the hand-in overview page. By clicking on a user/group here, it can be seen exactly which code was identical to another solution.

Settings

On the Settings page you can set some general parameters for your course, and add pages and an FAQ.

Configurations

Here you can select which of the 'Exercises' and 'Assignments' pages you want in the user menu, and whether the number of users who have solved an exercise should be visible to the users. Finally, you can select the default programming languages of the course, which will then be used for all exercises where 'Default Languages' is chosen.

FAQ

You can add an FAQ to your course, which will be shown on the 'Help' page. To add an FAQ entry you just need to specify the following:

Title The title of the FAQ entry (usually a question), which will be shown in the list of entries in the 'Help' page.
Description The description of the entry (usually an answer to the question), which will be shown when a user clicks on the entry.

Pages

You can add extra pages to the user menu. When adding a page you have the following properties to set:

Title The title of the page shown just below the menu when the page is opened.
Menu The name of the page in the user menu.
Url The name of the page in the URL. The URL will thus be codejudge.net/[COURSE NAME]/page/[Url]
Requires login: Select whether a user must be logged in to the course to view the page.
Content The content of the page, written in html. Use <div class="area"></div> to create a white area similar to the other pages.

Help

If you have any questions regarding CodeJudge, if you get unexpected errors, or if you have any suggestions for improvements, please feel free to contact us. You can normally expect an answer within 24 hours.

E-mail: support@codejudge.net
