SC18 Papers Submissions Open Today with New Review Process


In this special guest feature, SC18 Papers Chair Torsten Hoefler from ETH Zurich writes about big changes to the conference papers program. It's timely news, as SC18 paper submissions open today.

Torsten Hoefler from ETH Zurich is SC18 Papers Chair.

What many of us know can also be shown with numbers: the SC conference is the most prestigious conference in High Performance Computing (HPC). It ranks 6th in the "Computing Systems" category of Google Scholar's Metrics (with an h-index of 47 as of January 21, 2018), topped only by TPDS, FGCS, NSDI, ISCA, and ASPLOS, which makes it the highest-ranked HPC conference. The next one is arguably PPoPP, with an h-index of 37 at rank 20.

The SC conference routinely attracts more than 10,000 attendees, and nearly 50% of respondents in a representative survey listed attending technical presentations among their top three activities. This makes it the HPC conference where speakers reach the largest audience. I speak from experience: my SC17 talk drew an audience of more than 400 people, and its Twitter announcement quickly surpassed 10,000 views. So, it is the conference where big things start.

New Enhancements in ‘18

Todd Gamblin from LLNL is the SC18 Papers Vice Chair.

This year, I am honored to be SC18's Papers Chair, with the enormous help of my Vice Chair Todd Gamblin from LLNL. In light of this year's conference theme, "HPC Inspires," and to make this great conference even greater, we have planned some major enhancements to the submission process.

In addition to rebuttals, we are introducing two types of revisions during the review process. These allow authors to address reviewers' concerns directly in the paper draft while also adding new data to support their findings. Rebuttals remain possible but will become less important, since misunderstandings can be clarified directly in the draft.

Whether the paper is accepted or rejected, the authors end up with an improved version. The revision process increases interaction between the committee and the authors, which will ultimately raise the quality of the publications and talks at the conference. Overall, the process is an attempt to merge the best parts of the journal review process (expert reviewers and revisions) with those of the conference review process (a fixed schedule and quick turnaround).

A Bit of Background

This process was introduced to the HPC field and tested by David Keyes and me at the ACM PASC 2016 conference in Switzerland. We were inspired by top-class architecture and database conferences and adapted their process to the HPC community. The established PASC review process motivated the addition of revisions at IPDPS 2018 (through the advocacy of Marc Snir). Now, we are introducing similar improvements, scaled to the SC Conference Series.

The key innovations of the PASC review process were (1) no standing committee (the committee was assembled by the chairs based on the submissions, similar to a journal); (2) fully double-blind reviews (not even the TPC chairs knew the identity of the authors); (3) short revisions of papers (the authors could submit revised manuscripts with highlighted changes); and (4) expert reviewers (the original reviewers were asked to suggest experts on the topic for a second round of reviews). The results are documented in a presentation and a paper.

My personal highlight was a paper in my area whose ranking improved drastically from the first to the second review because it was largely rewritten during the revision process. In general, the revisions proved highly effective, as the statistics show: of the 105 first-round reviews, 19 improved their score by one point and two by two points in the second round.

Scores ranged from 1 (strong reject) to 5 (strong accept). These changes show how revisions improved many reviewers' opinions and turned good papers into great papers. Revisions even enabled the relatively high acceptance rate of 27% without compromising quality. Expert reviews also had a significant effect, which is analyzed in detail in the paper.

SC has a long history of receiving a large number of submissions, which requires a larger committee with a fixed structure spanning many areas. The conference is also tied to a traditional schedule. All this allows us to adopt only part of the changes successfully tested at PASC. Luckily, double-blind reviewing was already introduced in 2016, and 78% of attendee-survey respondents preferred it over non-double-blind reviewing. We can thus focus our attention on introducing the revision process and the consideration of expert reviews.

Dates, Deadlines and Revision Details

Adapting the revision process to SC was not a simple task, as schedules are set years in advance. For example, the deadline cannot be moved earlier than the end of March due to the necessary coordination with other top-class conferences such as ACM HPDC and ACM ICS (which is already tight, but doable this year). We will also NOT grant the "traditional" one-week extension. Let me repeat: there will be NO EXTENSIONS this year (as at many other top-class CS conferences).

Furthermore, the TPC meeting has already been scheduled for the beginning of June and could not be moved for administrative reasons. Most decisions must be made during that in-person TPC meeting. We also have to stay within the traditional acceptance rates of SC. Even within these constraints, however, significant positive changes are possible.

To fit the revision process into the SC schedule, we allow authors to submit a "featherweight" revision two weeks after receiving the initial reviews. This is a bit more time than for a rebuttal but may not be enough for a full revision; however, authors are free to start preparing it before receiving the reviews. Even in the case of a later rejection, we believe that improving a paper is useful. Each featherweight revision should clearly mark up the changes (while staying within the page limit); the choice of markup technique is left to the authors, as sketched below. In addition, the limited-length rebuttal can be used to discuss the changes.
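For illustration, here is a minimal sketch of how changes could be highlighted in a LaTeX draft. The changes package shown here is just one possible tool, not an SC18 requirement, and the example sentences are hypothetical:

\documentclass{article}
% The changes package marks insertions, deletions, and replacements in
% color; switching the [draft] option to [final] hides all markup.
\usepackage[draft]{changes}

\begin{document}
We evaluate the method on \replaced{1,024}{512} nodes.
\added{A new scaling experiment addresses a reviewer's concern.}
\deleted{This sentence was removed during the revision.}
\end{document}

A simple alternative is to color changed passages manually, e.g., with \textcolor from the xcolor package.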

Authors need to keep in mind that the reviewers will have *very little* time (less than one week before the TPC meeting) to review the featherweight revision; in fact, barely more than for reviewing a rebuttal. So, the more clearly the changes are marked and presented, the better the chances of reconsideration by the committee.

Furthermore, due to these unfortunate time constraints, a second round of reviews is not possible for the featherweight revision (reviewers are free to amend their reviews, but we cannot require them to do so). Nevertheless, we strongly believe that all authors can use this new freedom to improve their papers significantly. We will also try to provide feedback on each paper's relative ranking to the authors, if the submission system allows it.

Accept, Minor Revision, Major Revision, or Reject Categories

During the in-person TPC meeting, the track chairs will moderate the discussion of each paper and place it into one of the following categories: Accept, Minor Revision, Major Revision, or Reject. An accepted paper is deemed suitable for direct publication in the SC proceedings; we expect the top 3-5% of submissions to fall into that category. A Minor Revision is similar to a shepherded paper: it is accepted with minor amendments, pending a final review by the shepherd. We expect approximately 10% of submissions to fall into this category. This higher-than-traditional share of shepherded papers is consistent with top conferences in adjacent fields such as OSDI, NSDI, SOSP, and SIGMOD.

The new grade is Major Revision, which invites the authors to submit a substantially revised paper within one month. A major revision typically requires additional results or analyses. We expect no more than 10% of the initial submissions to fall into this category, and about 5% of the initial submissions to ultimately be accepted from it, depending on the final quality. Major-revision papers will be re-reviewed, and a final decision will be made during an online TPC discussion moderated by the respective track chair. Finally, papers rejected at any stage will not appear in the SC proceedings.

Regarding expert reviews, we may invite additional reviewers at any stage of the process. We therefore ask authors to specify all strong conflicts (even people outside the current committee) during the initial submission. Furthermore, we will have reviewers evaluate other reviewers' feedback, in the hope of improving the quality of the process in the long run.

At the end of this discussion, let me place a shameless plug for efforts to improve performance interpretability 🙂 We hope that the state of performance reporting can be improved at SC18. While many submissions use excellent scientific methods for evaluating performance on parallel computing systems, some could be improved by following a few very simple rules. I attempted to formalize a set of basic rules for performance reporting in the SC15 State-of-the-Practice paper "Scientific Benchmarking of Parallel Computing Systems". I invite all authors to follow these rules to improve their submissions to any conference (they are of course NOT a prerequisite for SC18 … but generally useful 😉).

In light of this year's "HPC Inspires" theme, we look forward to working with the technical papers team to make SC18 the best technical program ever and to consolidate the leading position of the SC Conference Series in the field of HPC. Please let me or Todd know if you have any comments or suggestions. Make sure to submit your best work before March 28 and help us make SC18's paper track the strongest ever!

Final Note

I especially want to thank David Keyes for advice and help during PASC'16 and Todd Gamblin for his great support in organizing SC18. In addition, Bronis de Supinski provided great ideas regarding the adaptation of the PASC process to the SC18 conference. Most thanks go to the track chairs and vice chairs who will support the implementation of the process during the SC18 paper selection (in the order of the tracks): Aydin Buluc, Maryam Mehri Dehnavi, Erik Draeger, Allison Baker, Si Hammond, Madeleine Glick, Lavanya Ramakrishnan, Ioan Raicu, Rob Ross, Kelly Gaither, Felix Wolf, Laura Carrington, Pat McCormick, Naoya Maruyama, Bronis de Supinski, Ashley Barker, Ron Brightwell, and Rosa Badia. And last, but not least, the 200+ reviewers of the SC18 technical papers program!

SC18 Papers Vice Chair Todd Gamblin from LLNL also contributed to this article.

Paper submissions are now open for SC18, which takes place Nov. 11-16 in Dallas.
