Journal of Hate Studies

Annual Review Issue

Journal of Hate Studies (JHS) welcomes suggestions for books, films, exhibits, and performances to review for our annual issue of collected reviews. Authors, editors, filmmakers, curators, and performers are welcome to submit their work for review, and we also encourage readers to nominate items for review. Contact the editor with your suggestions.

Likewise, we seek reviewers who will offer thoughtful reviews, typically of 800-1200 words, of such works, drawing on their own areas of expertise to critically engage with books, films, exhibits, and performances that advance and expand hate studies scholarship. To volunteer as a reviewer, please contact the editor.

JHS values unique perspectives and contributions and therefore prohibits the use of generative AI by authors, editors, and reviewers. Those authoring reviews are prohibited from using AI tools to analyze books, films, exhibits, performances, and other works. Those who create such work place their trust in reviewers, vulnerably sharing difficult and often deeply personal work, and we return that trust by offering only thoughtful, engaged, authentic reviews written by human readers and viewers.

Failure to disclose the use of generative AI may result in retractions of publications, among other corrective actions, even if the content published does not contain errors or plagiarism caused by AI use.

Review writers who find themselves struggling to write without the use of generative AI are encouraged to contact the managing editor or editorial board chair for support. JHS’s larger mission includes increasing scholarly interest and competency in hate studies, and supporting writers is part of that work.

Generative v. Assistive AI

Assistive AI tools are permitted in manuscripts and reviews of manuscripts, and their use does not have to be disclosed. These are tools that merely address lower-level technical tasks, such as identifying spelling errors and typos, proposing diction choices and syntax changes to improve readability, and formatting bibliographies of works the author has read. They do not propose ideas, locate sources, summarize key points in readings, or generate text based on prompts. When you create content and use a digital tool to refine it, you are using assistive AI. Examples include spell check and citation management systems. However, authors alone are responsible for the content of their publications and should ensure that even permissible assistive AI does not introduce errors into their work.

Generative AI tools produce content, including outlines and summaries of existing texts, manuscripts, images, charts and graphs, and translations. Manuscripts that originate in generative AI content (such as articles written from an AI-generated outline or built from heavily revised AI-generated text) must disclose that use when submitted to JHS, and it is unlikely that they will be published. Example tools include ChatGPT, Claude, and Copilot.