How might we improve the peer review experience?
The Publisherspeak peer review survey 2022 results

In 2022, we wanted to take a closer look at one of the pillars central to the scholarly research process: peer review. Our goal with the survey, “How might we improve the peer review experience?”, was to understand the unique perspectives and challenges of authors, reviewers, editors, and publishing team members. We looked at peer review from two angles: one, the challenges faced by different stakeholders, and two, the gaps in the current processes.

Now, the stakeholders in peer review have spoken, and we are excited to share with you the results of the survey and the insights we gained from different participants in the peer review process.

Survey methodology and background information

We published our peer review survey in March 2022. The survey included questions about the role(s) played by the recipient in the peer review process, their typical challenges, the improvements they would like to see in the peer review process and platforms, and more. This survey was shared via email campaigns and posts on social media.

People playing different roles in the peer review process participated in this survey over a period of roughly 3 weeks. 34% of respondents were from Europe, 24% from North America, 24% from South America, and the remaining 18% from Asia.

Authors made up 16% of the respondents, while 17% were reviewers, 28% were editors, and 39% were publishing team members. Several respondents chose multiple roles, with 7.7% of respondents choosing all four roles – author, reviewer, editor, and publishing team member.

The authors’ perspective

Starting with the central figure of the publishing process, we asked authors what their typical challenges and frustrations were while submitting manuscripts. The 3 most popular responses were having to use multiple tools to get their manuscripts ready for submission, understanding and adhering to submission guidelines, and ensuring that their manuscript language quality is up to the mark.

75% of the respondents who chose manuscript language quality as one of their challenges were from Europe and Asia. This underscores the importance of multilingual support in journal publishing to make the ecosystem more equitable and accessible to non-English-speaking authors and readers.

Other notable challenges include ensuring that the author’s critical data and assets are validated properly and having to arrange references as per journal guidelines.

When asked about the improvements that they would like to see in the peer review process, here are the top 3 improvements the authors picked:

  1. Better visibility & updates of the review process
  2. Ease of resubmission of revised manuscript
  3. Structured and clear feedback reports and revision requests

A few authors also mentioned that they would like to have a smaller time gap between submission and peer review and the ability to manage all their submissions in one place.

The reviewers’ POV

Reviewers are critical stakeholders in the peer review process; they help uphold the integrity of published research as external referees with their voluntary services and contributions. According to the peer reviewers that took our survey, the top problems they face during the peer review process include:

  1. Confirming whether manuscript revisions have addressed their review comments
  2. Giving structured, contextual, and constructive feedback
  3. The lack of quality in data, assets, and language in submitted manuscripts

A few reviewers also highlighted issues with accessing and using peer review platforms, managing and tracking multiple manuscripts for review, and the lack of clarity in review guidelines and parameters.

We also asked reviewers what one critical factor would most improve the quality of their peer review experience. Reviewers felt that three things are key to a successful peer review process: validation of submission data, assertions in the manuscript, and references; structured feedback forms and templates for adding review comments; and ease of use of the peer review system.

While structured feedback forms and templates were among the top choices, a small minority of reviewers argued that it may be better not to have any feedback forms at all, since some journal-provided forms cover a multitude of minor issues and draw attention away from the important ones.

The editors’ experience

Editors help guide the peer review process and ensure that it is fair, timely, and rigorous. We asked editors about the challenges they face during peer review, and here are their top responses:

  1. Finding suitable peer reviewers
  2. Managing their peer review workload – assigning reviewers, tracking, and following up on review status
  3. The lack of proper reports

Other popular responses include ensuring a fair and unbiased review process, capturing and sharing reviewers’ feedback in an organized manner, and the lack of configurability in setting up different peer review workflows. A few editors also revealed that it is often difficult to ensure that reviews are completed on time and deadlines are met because reviewers are often busy with their own research.

On the subject of peer review models, we asked editors which model they foresee as being the most beneficial – single-blinded, double-blinded, transferable, collaborative, post-publication, open peer review, or any other model(s). 

69% of editors chose double-blinded peer review as the most beneficial model. When asked why, the main reasons that emerged were:

  1. Double-blinded review gives researchers the freedom to review papers even from close colleagues and to make observations they might not make in person, which makes the process fairer.
  2. Double-blinded review allows for less bias.
  3. While the trend in other areas is to make everything collaborative and open, there is a lot of value in not knowing names and keeping this in reserve to avoid “bad practices.”

The second most popular option, chosen by 19% of respondents, was open peer review, followed by collaborative and single-blinded peer review (6% each). 

The publishing teams’ outlook

Publishing teams play the role of overseeing the peer review process and enabling the smooth dissemination of research. A peer review system is a critical tool that enables publishing teams to do so. Here are the most popular factors publishing team members consider while making the decision to adopt a peer review system:

  1. Affordable pricing (e.g., pay-per-use pricing, low setup fees)
  2. Seamless integration with industry tools and databases
  3. Seamless integration with production systems
  4. Flexible blinding options and customizable workflows

Other notable factors include plagiarism detection, actionable reports, adoption of AI/ML and blockchain technology, and multilingual support.

Here are the most common challenges that publishing team members said they face during the peer review process:

  1. Finding suitable peer reviewers
  2. Having visibility over and managing the peer review workflow
  3. The lack of configurable workflows (e.g., blinding, open peer review)
  4. Cascading submissions seamlessly to other journals
  5. Porting metadata into production systems after acceptance
  6. Capturing and sharing reviewers’ feedback in an organized manner

Publishing team members also mentioned challenges around long review cycles, duplicate submissions, and the lack of integration between peer review platforms and third-party systems like ORCID and the Crossref Funder Registry.

Peer review is essential to research; this process plays the critical role of upholding research integrity. It has come a long way in the last few decades, but there is much scope for improving the process to make it a smoother and more rewarding experience for everyone involved. This calls for the research community to collaborate as a cohesive unit for better execution of peer review that will ultimately help humanity address the challenges we face.

What’s the focus for journal publishing in 2022?

We kicked off this year with the Publisherspeak Journals Think Tank, a forum of journal publishing professionals that meets regularly to discuss challenges and brainstorm ideas. We had our very first meeting in January, where we discussed several priority topics.

The publishing landscape changed significantly during the pandemic. What’s the focus for journal publishing in 2022? The panel answers.

Collaborative research

As we discussed in a previous post, collaborative research was greatly affected by the pandemic. With borders closed in most countries, researchers were unable to travel and work together. The pandemic also affected the conferences that previously created opportunities for researchers to interact and collaborate. In 2022, the panel hoped that the world would open up in a stable way, with travel restrictions being lifted globally, making way for collaborative research to restart.

A cohesive publishing lifecycle

A vast amount of data is already available, but at present there is a lack of cohesive links between the different data sources. Linking metadata from one stage of the publishing lifecycle to the next ought to be a key priority. But for this to happen, accurate metadata has to flow throughout the system. The panel discussed two solutions:

  • Reducing instances of manual intervention can help streamline the lifecycle.
  • Persistent identifiers, such as grant DOIs from funders, can provide reliable and consistent metadata without authors having to rekey information.

Researcher experience

Simplifying the author experience throughout the publishing process is paramount. Currently, reformatting manuscripts to satisfy the criteria of different journals is a significant challenge. Furthermore, extensive style guides can hinder efficiency. The panel discussed the scope for harmonizing style guides across journals. With the resources already available, including PIDs like the ORCID iD, it is possible to enable authors to submit information without having to rekey all of their details for every submission.

As this discussion progressed, a new question arose: In the quest to improve author experience, do the reviewer and editor experiences get overlooked?

This led to a consensus that a holistic approach that focuses on improving the overall researcher experience is the need of the hour. Adjustments ought to be made to the current system to improve the author, reviewer, and editor experiences as a whole. This improvement can be facilitated by intuitive and comprehensive platforms.

Stay tuned for the next installment of the Journals Think Tank!

About the Publisherspeak Think Tank

The publishing landscape is evolving rapidly, giving rise to exciting developments and new challenges. The Publisherspeak Think Tank brings together experts from diverse areas of the publishing ecosystem to share their experiences and insights on adapting to challenges and adopting industry trends.

Open Access challenges: From adoption to operation

We kicked off this year with the Publisherspeak Journals Think Tank, a forum of journal publishing professionals that meets regularly to discuss challenges and brainstorm ideas. We had our very first meeting in January, where we discussed several priority topics.

The open access (OA) movement has been in motion for decades now, but it gained significant momentum in the last two years—the COVID-19 pandemic raised the call for open data and unrestricted access to research, Plan S came into effect in 2021, cOAlition S launched more initiatives to advance the movement, Open Access Week 2021 took place with the theme “It Matters How We Open Knowledge: Building Structural Equity,” and more. 

Despite the growth of OA, there are still many pressing challenges that hinder progress.

Adoption hesitancy

While the OA movement has seen much success in many areas, some disciplines remain hesitant to adopt this publishing model. The panel debated the need for versatile models for different scenarios—there is no “one size fits all” model that works for all publishers, across all disciplines. However, the resulting emergence of a large number of OA options in itself presents challenges, especially for young journals.

On potential solutions for this challenge, the panel concurred that a hybrid publishing model can be a good starting point for established journals to dip their toe into the waters of the OA movement, but that launching new journals under an OA model typically makes more sense in the current market. It is important to ensure that journal integrity and quality are upheld in finding a balance between publishing costs and content affordability.

Metadata reliability

In the digital age, there is an abundance of metadata. But is this data being put to the best use? Publishers deal with a vast amount of author and article metadata across different sources, including submission systems, production systems, and hosting platforms. The panel discussed the challenge of having to match all this information to ensure consistency and reliability throughout the publishing lifecycle and beyond.

Uniform flow of metadata from submission to publication is the key to data integrity. This calls for a single system that would prevent siloing of data and simplify its flow from one stage to the next, with links out to essential persistent identifiers like ORCID.

The elusive sweet spot

A hard truth is that the need to secure a publication’s future can itself be a challenge in adopting OA. It is especially difficult to identify the “sweet spot” where a publication can keep its article charges affordable while still covering its costs and retaining a surplus to grow its program in the future.

From a publisher’s point of view, submission fees and publication fees are good revenue streams that also serve to filter out unqualified submissions. And from a reader’s perspective, online OA models are desirable because they provide ample opportunities for articles to receive citations and recognition.

But the article processing charge (APC) route may not be the right, or the most inclusive, way to go. Funders and publishers need to consider scenarios where researchers and institutions are not as well funded. Considering this, an OA model that shifts the burden of payment away from both authors and readers seems more favorable.

Furthermore, the panel also discussed national grants and government initiatives and how these can go a long way in bolstering the OA movement. For instance, Redalyc is an OA digital library supported by the Universidad Autónoma del Estado de México, a public Mexican university, along with other institutions. SciELO is an OA journal database that covers Latin America, the Iberian Peninsula, and South Africa.

The panel agreed that as the OA movement continues to expand, a concerted collaborative effort from publishers, funders, researchers, and governments is the need of the hour.

Stay tuned for the next installment of the Journals Think Tank!

About the Publisherspeak Think Tank

The publishing landscape is evolving rapidly, giving rise to exciting developments and new challenges. The Publisherspeak Think Tank brings together experts from diverse areas of the publishing ecosystem to share their experiences and insights on adapting to challenges and adopting industry trends.

Publishing amidst a pandemic

We kicked off this year with the Publisherspeak Journals Think Tank, a forum of journal publishing professionals that meets regularly to discuss challenges and brainstorm ideas. We had our very first meeting in January, where we discussed several priority topics.

Publishing, much like most industries, has been deeply impacted by the global COVID-19 pandemic. How has the pandemic and the shift to remote work affected research, publishing productivity, and team morale? What does the future of publishing look like?

Research during the pandemic

The panel described the two conflicting effects of the pandemic on research:

Many researchers were able to work on data analysis and writing during the pandemic as they had more time to write and analyze their work. This led to a healthy flow of manuscript submissions. Many were also able to work on grant proposals to set themselves up for future research.

Conversely, those involved in disciplines that require field research were restricted due to global lockdowns, and as a result, manuscript submissions in these fields have tapered. Ph.D. candidates who had planned to commence field research in 2020 never got the chance to do so, which has negatively affected their morale and productivity. Collaborative research has also been deeply impacted by such restrictions.

With the world cautiously reopening, the panel noted a shift back towards research.

The WFH effect – Team morale and productivity

Publishers who worked with remote teams before the pandemic were able to transition to remote work without much difficulty. Teams that adopted remote work for the first time, however, saw resistance from some members during the initial stages of the change. Over time, not only was this hesitation overcome, but most employees actively embraced the remote work model.

The panel agreed that remote work paved the way for versatile hiring decisions, enabling publishers to hire from a wider pool of candidates and offer flexible working options. Adopting remote work has also led to cost savings in some instances, prompting the development of smaller, smarter workspaces.

On the flip side, workforce morale has been affected by the WFH model. The following tips were shared and discussed by the panelists as ways to address this challenge:

  • Establish clear KPIs and metrics for getting the work done.
  • Set up channels for frequent communication.
  • Build trust among teams.
  • Make room for virtual social catch-up meetings.
  • Be mindful of following up with employees who are struggling with staying productive as well as those who are overworking themselves. 

The panel discussed the importance of staying on top of team goals and tasks to create a productive remote work model.

The way forward

A change in the publishing model was needed in order to react to the vast volume of content that came in and to prioritize COVID research. The panelists concurred that this change has made publishing more productive.

The world has embraced virtual meetings and remote work. On the one hand, virtual meetings have proven useful for editorial board meetings that require members from different journals to come together. On the other hand, organizing a virtual meeting for geographically dispersed participants can be complex due to time zone differences. Despite their shortcomings, virtual meetings have become the new normal. This begs the question: will we ever return to fully in-person settings?

The panel agreed that remote work has been very beneficial. Working virtually, teams have been able to put out extraordinary amounts of output during the pandemic. 

Regarding the challenges of remote work, the panelists noted that working from home does tend to magnify certain behaviors. It can be difficult to interpret and address morale and attitude issues remotely. The panel agreed that although a return to fully in-person setups did not seem conceivable, some level of face-to-face contact and interaction is necessary.

The panel remarked that as they hold cautious hope for the world to reopen in 2022, there are many learnings from the pandemic that they hope to carry forward in their return to normalcy. Considering the success of remote work and how virtual events have enabled a wider audience to participate, the future of work and events would most likely involve a hybrid environment with in-person and online formats. 

In summary, there was a clear consensus that the pandemic has transformed the way publishing works irrevocably and, in many ways, for the better.

Join the Publisherspeak community to access more insights from the Journals Think Tank!

About the Publisherspeak Think Tank

The publishing landscape is evolving rapidly, giving rise to exciting developments and new challenges. The Publisherspeak Think Tank brings together experts from diverse areas of the publishing ecosystem to share their experiences and insights on adapting to challenges and adopting industry trends.

An open conversation around open access

Open Access Week 2021 stimulated many discussions across the world pertaining to the theme, “It matters how we open knowledge: Building structural equity”. In light of this year’s event, we discuss open access (OA) and open research with

Lisa Walton, Global Publishing Strategy Manager at BMJ, a global healthcare knowledge provider with a wide variety of products and services; 

Bryan Hibbard, Journal Editorial and Production Manager at Society of Petroleum Engineers (SPE), an independent, nonprofit global society focused on the upstream oil and gas industry; and 

Melissa Harrison, former Head of Production Operations at eLife, a non-profit organization that works across publishing, technology, and research culture.  

Where are we now in terms of opening knowledge equitably and what needs to be done? What can publishers do to build a sustainable open ecosystem?

Melissa Harrison: Science is a global endeavor, and it should work to improve the lives of all of humanity. Yet this is not achieved, in part because the current scientific enterprise frequently perpetuates inequalities that exclude people from participating in science or accessing its outputs. A diverse scientific workforce is key to solving the complex problems facing society today, but not everyone is welcomed, appreciated or empowered in science. Discrimination and other inequalities deprive the scientific workforce of many talented individuals. Biases – whether explicit, implicit or systematic – often go unacknowledged and unaddressed. Only a narrow set of perspectives, backgrounds, contributions and career paths are commonly valued and respected, which combined with intense competition for jobs and funding, can make scientific and medical research unhealthy places to work with poor work-life balance.

Open access is just the beginning. To truly democratize scientific outputs we need to go further and build open source tooling and apply things like The Principles of Open Scholarly Infrastructure (POSI) to this infrastructure. The roles of funders and institutions are even more important than publishers in creating a sustainable ecosystem.

Bryan Hibbard: Many of the early open access initiatives focused on the largest players as well as those that were well funded. Requiring an author to pay an article processing charge (APC) to publish OA can be prohibitively expensive for many researchers. While hybrid open access journals still offer a free publishing path, they disadvantage those who cannot afford OA. While from a reader standpoint, the OA movement has increased access, it has not done the same for authors. We need to work towards fair models that cover the cost of publication but give equal access to OA for all authors. 

OA models need to be fair to all and sustainable for both large and small publishers. I have yet to see a sustainable Gold OA option. While many publishers using transformative agreements are dependent on research libraries to continue to fund research that they can access for free, I believe this will be the first area that is cut when libraries are asked to contract their budget.

Lisa Walton: Reducing the burden on researchers to publish open access and ensuring equitable access to publish openly both contribute to the sustainability of the open access ecosystem. As part of our work towards that, BMJ is evaluating its waiver policy and working on offering transitional agreements that remove administrative barriers to publishing open access.

Please tell us about the OA initiatives adopted by your journal and the impact these have had on your authors and readers.

Lisa Walton: The research in our flagship journal, The BMJ, has always been free to read, and in 2011 we launched our first and largest open access medical journal, BMJ Open. Today, a third of our journals are fully open access, and we also make academic research freely accessible and discoverable with hybrid publication models. The majority of our hybrid journals have been given Transformative Journal status by cOAlition S. Through BMJ’s OA campaign, we support authors to achieve global impact, broader reach, and exceptional quality. 

In 2019, BMJ co-launched medRxiv with Yale University and Cold Spring Harbor Laboratory. It is the first health sciences preprint server, and it allows fast sharing of preliminary research findings to the widest possible audience. medRxiv now highlights when preprints have been accepted or published across BMJ’s journal portfolio, further contributing to the reliability and value of preprints as part of the scientific record. Journals like BMJ Open Science improve the validity and quality of pre-clinical research through open practices.

BMJ is also part of initiatives like Initiative for Open Citations (I4OC) and Initiative for Open Abstracts (I4OA), making it easier for articles to be found, read, and cited. 

Bryan Hibbard: SPE will be introducing a hybrid open access model in 2022 using traditional APCs with discounts for members and authors from low- and middle-income countries. Open access will be optional, and the subscription model will still be available to all.

Melissa Harrison: eLife has been open access from the outset so we focus our attention on open science, which includes reproducibility, open data, open software and so on.

eLife, via Mark Patterson, was one of the organizations that spearheaded DORA (the Declaration on Research Assessment, which recognizes the need to improve the ways in which researchers and the outputs of scholarly research are evaluated) and I4OC (the Initiative for Open Citations). We have since supported I4OA.

We ensure our content is machine readable, and we submit as much metadata as we have and as the Crossref schema allows. Crossref APIs are used so far and wide, and so much is being built upon them, that we feel it’s important to make as much available as possible there; hence the I4OA and I4OC initiatives.

Editorial policies and eLife staff QC support open science, and we publish transparent reporting forms, key resources tables, and data availability statements, as well as encouraging authors to share the underlying code and data that support the research. We support many other initiatives, including the CRediT taxonomy, the FAIR sharing principles, the Open Funder Registry, and ROR. From all of this, we also generate full-text XML that semantically captures these initiatives, develop standardization across our corpus XML via JATS4R recommendations, and deliver content downstream to indexers and Crossref.

What role do you think DEI initiatives play in making research accessible to a wider audience?

Lisa Walton: DEI initiatives can give authors and researchers equal support to get their work published, regardless of their sex, gender, race or ethnicity, first language, sexual orientation, religion, beliefs, disability status, age, status, nationality or citizenship. This approach serves to dismantle the barriers that have previously prevented women and underrepresented groups from being published, having a voice or advancing their careers.

Melissa Harrison: Fortunately, there is increasing recognition of these issues, and a willingness to face them. As a publisher and organization looking to reform research communication, eLife has the ability to influence the community and promote greater equity, diversity and inclusion in research and publishing.

eLife’s Community Ambassadors programme and Early-Career Advisory Group are active in helping us change, and we have targets to diversify our editorial board and staff composition. We have introduced a code of conduct for all eLife interactions, as well as a social media policy, to ensure that all voices are heard and mutual respect is upheld. There are many other areas of activity ongoing and in development.

Where do you see Open Access in the next few years? What are the changes you anticipate?

Bryan Hibbard: I expect many more new ideas to appear in the OA field and expect to see a crystallization of what actually works in OA. I think we are still in the very early stages, and publishers are throwing ideas at the wall to see what sticks. I do believe that many of the current models that publishers are pursuing will turn out to be unsustainable. This is one of the reasons that we are dipping our toe in, so to speak, by utilizing hybrid open access and waiting to see what comes next.

Lisa Walton: BMJ supports and promotes a future built on the principle of unrestricted access to the outputs of medical research, encouraging scientific discourse and facilitating further medical advances. We expect to see increases in the proportion and amount of research published open access.

Melissa Harrison: Any new journal being launched is open access. The challenge is to flip the long-standing hybrid or closed-access journals to an open access model. Funders have signaled their commitment to open access, and initiatives like Plan S are pushing the needle further. The key issue will be to change the whole business model around publishing and to make this achievable for any journal, not just the big publishers.

Events like Open Access Week 2021 are critical for the growth of the global research community, as they encourage all stakeholders to discuss pressing issues. We are grateful to Lisa Walton, Bryan Hibbard, and Melissa Harrison for sharing their insights with us!

 


Leveraging the power of XML data validation

In Europe, hospitals are required to submit health records of their patients to the government through an institute. Proper submission of these records helps the government deliver benefits to patients with medical insurance and understand the medical conditions of the patients in their country.

The problem with the submission system

The hospital recorded all its patient information in a *.csv document and submitted it to the institute’s server as a .zip file. This file contained the records of thousands of patients, and if even one record was incorrect, the institute had to reject the entire package.

For example, some of the entries in the “date” field were either invalid or inaccurate, such as ‘last Sunday’ or ‘32nd March’. There was no system in place to check whether the data were valid.

This put the responsibility on the hospital to sort through every entry and ensure that each value was within the appropriate parameters. Correcting this content was a time- and labor-intensive task for the hospitals.

The government institute began to look for a solution because it was responsible for ensuring that the government received correct information from all the hospitals. Upon doing its research, it came across XML and realized that its validation capabilities could help clean up the data.
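
To make the idea concrete, here is a minimal sketch, not the institute’s actual schema, of how schema-based validation catches exactly this kind of error: the visit date is declared as xs:date, so an entry like ‘32nd March’ fails validation and is reported with its location. The element names are hypothetical, and the example requires the lxml package.

  # Minimal sketch of schema-based date validation (hypothetical schema, not the institute's).
  # Requires the lxml package: pip install lxml
  from lxml import etree

  XSD = b"""<?xml version="1.0"?>
  <xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema">
    <xs:element name="patientRecord">
      <xs:complexType>
        <xs:sequence>
          <xs:element name="patientId" type="xs:string"/>
          <xs:element name="visitDate" type="xs:date"/>  <!-- must be a real YYYY-MM-DD date -->
        </xs:sequence>
      </xs:complexType>
    </xs:element>
  </xs:schema>"""

  schema = etree.XMLSchema(etree.fromstring(XSD))

  good = etree.fromstring(b"<patientRecord><patientId>P001</patientId>"
                          b"<visitDate>2021-03-28</visitDate></patientRecord>")
  bad = etree.fromstring(b"<patientRecord><patientId>P002</patientId>"
                         b"<visitDate>32nd March</visitDate></patientRecord>")

  print(schema.validate(good))   # True
  print(schema.validate(bad))    # False
  print(schema.error_log)        # reports the offending element and the reason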

Reining in the stakeholders

The project involved many stakeholders, such as hospital administrators, data analysts, and lawyers from the institute and the government. Seeing the scope of what XML could do made the stakeholders ambitious about optimizing their own parts of the process.

Each stakeholder had different requirements based on their role in the process. For example, the data entry executive at the hospital would want the schema organized so that it flagged incorrect entries and guided users to correct them. The lawyers were concerned about the legal requirements around processing, storing, and sharing data. The data analysts were focused on checking criteria like the minimum number of visits needed for refunds. Because of these varying requirements and interests, each hospital had its own opinion about the model it would like, to match its systems.

Therefore, the solution had to be approached through the following steps (a sketch of the final step follows the list):

  • Define the information model to bring all the stakeholders into the picture.
  • Derive the XML schema from the information model.
  • Determine what data the hospitals needed to enter to ensure that the necessary information was submitted.
  • Provide a small ‘transformation’ for each hospital, enabling it to supply data the way it wanted to while ensuring that all the legally required information was included.
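
Here is a hedged sketch of that last ‘transformation’ step, using entirely hypothetical column and element names: each hospital keeps its own CSV layout, and a small mapping turns it into the elements the shared schema expects before validation.

  # Hypothetical sketch of the per-hospital 'transformation' step: each hospital keeps its
  # own CSV column names, and a small mapping turns them into the elements the shared
  # schema requires before validation.
  import csv, io
  import xml.etree.ElementTree as ET

  # Column mapping for one (fictional) hospital; another hospital would supply its own.
  HOSPITAL_A_MAP = {"pat_no": "patientId", "seen_on": "visitDate", "dx": "diagnosisCode"}

  SAMPLE_CSV = "pat_no,seen_on,dx\nP001,2021-03-28,J45\n"

  def csv_to_records(text, column_map):
      root = ET.Element("records")
      for row in csv.DictReader(io.StringIO(text)):
          rec = ET.SubElement(root, "patientRecord")
          for source_col, target_elem in column_map.items():
              ET.SubElement(rec, target_elem).text = row[source_col]
      return root

  print(ET.tostring(csv_to_records(SAMPLE_CSV, HOSPITAL_A_MAP), encoding="unicode"))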

Implementation of XML Schema

The hospital database stores data as *.csv files in a flat, tabular structure. XML documents, on the other hand, are hierarchical, which means that some entries are subsets of other entries.

In this project, the data that was first entered in a flat structure had to be transformed into a hierarchical structure and later fed into a database with a flat structure again. But the process of loading the XML directly into the database was riddled with issues. So, an intermediate, generic XML was created to mimic the *.csv structure. This made it easy for the XML data to be loaded into the database.
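
An intermediate, CSV-like XML of this kind might look like the following sketch (illustrative only, not the project’s actual format); because each row element mirrors one flat record, flattening it back into database rows is straightforward.

  # Illustrative sketch of a generic, CSV-like intermediate XML (not the project's actual
  # format) and of flattening it back into rows for a tabular database.
  import xml.etree.ElementTree as ET

  GENERIC_XML = """
  <table name="patient_visits">
    <row>
      <field name="patientId">P001</field>
      <field name="visitDate">2021-03-28</field>
    </row>
    <row>
      <field name="patientId">P002</field>
      <field name="visitDate">2021-04-02</field>
    </row>
  </table>
  """

  def rows(xml_text):
      for row in ET.fromstring(xml_text).iter("row"):
          # Each <row> becomes one flat record, ready for a simple bulk INSERT.
          yield {f.get("name"): f.text for f in row.iter("field")}

  for record in rows(GENERIC_XML):
      print(record)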

What was the value?

Several hospitals now no longer need to sort through thousands of entries when the large .zip file is rejected by the server; they know exactly which record is incorrect and have the means to correct it without spending several man-hours. Hospitals that require data validation for other projects can now use this model to verify those projects as well. Moreover, there is scope for expanding the business rules for validation to further improve data quality.

What can publishers take away from this?

In this project, XML was chosen specifically for its validation powers. Publishers are in the business of validating information and making it accessible to their readers, so they need to ensure that their data is valid and accurate. Much like the government institute in this episode of XML Stories, journal publishers also stand to gain from XML. In the case of journals, there is a need to ensure that the dates and references in articles are correct. For example, when a chemical substance mentioned in an article is checked against an existing list, validation can ensure that the reference is accurate, creating richer metadata that makes content easily accessible and visible to the right people.
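
As a simple illustration of validating against an existing list, a schema enumeration can reject any substance name that is not on the controlled list. The element name and values below are hypothetical, and the example requires the lxml package.

  # Hedged sketch: validating a substance reference against a controlled list by means of
  # an xs:enumeration (hypothetical element and values). Requires lxml.
  from lxml import etree

  XSD = b"""<?xml version="1.0"?>
  <xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema">
    <xs:element name="substance">
      <xs:simpleType>
        <xs:restriction base="xs:string">
          <xs:enumeration value="ethanol"/>
          <xs:enumeration value="acetone"/>
          <xs:enumeration value="toluene"/>
        </xs:restriction>
      </xs:simpleType>
    </xs:element>
  </xs:schema>"""

  schema = etree.XMLSchema(etree.fromstring(XSD))
  print(schema.validate(etree.fromstring(b"<substance>acetone</substance>")))  # True
  print(schema.validate(etree.fromstring(b"<substance>aceton</substance>")))   # False: typo caught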


Optimize revision cycles and publish digital collaterals with XML

About a quarter of the world’s population is under 15 years of age, and the majority of them are in school. One of the successful and time-honored ways by which education is imparted to them is through textbooks.

The educational publisher covered in this episode of XML Stories mainly produces books for school children, along with some titles for university students. Their authors are school-subject specialists and university professors, so the content ranges from school-level material to advanced material for university students. The common factor is that all published materials are educational.

The publisher wants to produce high-quality educational material in a well-printed book, accompanied by interactive collaterals such as DVDs and websites.

What was the challenge?

Most authors were not comfortable using FrameMaker to edit content. As a result, the book had to be physically printed and given to authors for editing; the edits were made on sticky notes and returned to the typesetter for correction. This feedback cycle would occur a few times, in addition to an editorial and redaction cycle. This classical workflow was not cost-effective, especially when a book had been written by multiple authors, so the publisher began searching for solutions to address these challenges.

What kind of solution was the publisher looking for?

Initially, the publisher was hoping for a single source for their printed books and the collaterals that come with them, such as websites, animations, videos, and references. They did their research and discovered the core benefit of XML: create once in XML and publish in a wide variety of formats.
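
As a toy illustration of “create in XML and publish in a wide variety of formats” (real pipelines typically use XSLT or a publishing platform rather than hand-written renderers), the sketch below takes one hypothetical XML chapter and produces both an HTML rendition and a plain-text, print-oriented rendition from the same source.

  # Minimal single-source sketch (hypothetical markup): one XML chapter, two outputs.
  import xml.etree.ElementTree as ET

  CHAPTER = """
  <chapter title="Photosynthesis">
    <para>Plants convert light energy into chemical energy.</para>
    <para>Chlorophyll absorbs mostly red and blue light.</para>
  </chapter>
  """

  def to_html(chapter):
      body = "".join(f"<p>{p.text}</p>" for p in chapter.iter("para"))
      return f"<h1>{chapter.get('title')}</h1>{body}"

  def to_plain_text(chapter):
      paras = "\n\n".join(p.text for p in chapter.iter("para"))
      return f"{chapter.get('title').upper()}\n\n{paras}"

  chapter = ET.fromstring(CHAPTER)
  print(to_html(chapter))        # web/HTML rendition
  print(to_plain_text(chapter))  # print-oriented rendition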

When the publisher discovered the scope of XML, they wondered if they could also find a solution for their revision cycle issue, where subject matter experts and authors were reviewing books from a print delivered by a FrameMaker typesetter.

What is the result of shifting to XML?

The authors, editors, and other stakeholders involved in the revision process no longer need to work with physical materials. They use a virtual, WYSIWYG interface to complete their edits without the need for typesetters to revise every version of the book.

In the earlier FrameMaker setup, the publisher created interactive content in the form of CDs and websites. However, this process had limitations.

The publisher desired greater control over the scope of the users’ interactive experience, and this level of control could not be achieved in the FrameMaker workflow.

For instance, there may be a case where students need to be guided from one exercise to another only after completing the first. This can be achieved only with the features and facilities of XML. Many digital publishers in the education field have entered the space of creating learning modules. A background in XML empowers the publisher with the tools they need to build such modules.
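
One way such sequencing might be expressed in the content itself, purely as an illustration and not the publisher’s actual markup, is a prerequisite attribute on each exercise that a learning platform can then enforce.

  # Hypothetical sketch: exercises carry a "requires" attribute, and the delivery platform
  # only unlocks an exercise once its prerequisite has been completed.
  import xml.etree.ElementTree as ET

  MODULE = """
  <module>
    <exercise id="ex1" title="Label the cell diagram"/>
    <exercise id="ex2" title="Explain osmosis" requires="ex1"/>
  </module>
  """

  def unlocked(module_xml, completed):
      module = ET.fromstring(module_xml)
      return [ex.get("title") for ex in module.iter("exercise")
              if ex.get("requires") in completed or ex.get("requires") is None]

  print(unlocked(MODULE, completed=set()))     # only the first exercise is available
  print(unlocked(MODULE, completed={"ex1"}))   # both exercises are available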

In this case, the textbook publisher was looking for a solution to a specific problem related to content revision cycles. When the publisher explored the scope of XML technology further, they were able to reimagine the technology as a solution for various other publishing workflow-related optimizations as well.


Increase publishing turnaround by 2500% at no additional cost

What’s in a drilling rig manual?

Drilling rigs are used for a variety of purposes, such as mining, taking samples from the ground, studying rock formations hundreds of meters below the surface, and more. Depending on the task, a user buys different types of drills with different platforms, and sometimes a different type of tower or power supply setup. Drilling rigs are portable devices made up of several individual machines; they are disassembled and reassembled at every point of use.

When a user buys a drilling rig, they are entitled to a manual matching their specifications and language requirements. These manuals are necessary because many drilling operations occur in locations far from the reach of mobile towers and service specialists. Although the company that makes these rigs sold over 300 a year, it was able to deliver only about a dozen manuals that came close to matching specific configurations. This meant that many customers, despite paying millions for the product, would not receive a manual suited to their exact selection of drilling equipment.

What’s the scope of delivery?

The manufacturer outsourced the job to a publisher, who edited the manuals and created the layout as per specifications. As each manual was hundreds of pages long, the copyeditor had to painstakingly find every instance in the book where corrections were required, work that was tedious and subject to human error. Every time there was a change in the content, the layout specialist also had to come in and rework the affected pages.

Considering the number of pages and the size of the task at hand, it is evident why the publisher found it difficult to deliver the required 300 manuals per year.

What was the challenge?

A variety of tools can be used to create a document like a product manual. One might use a word processor such as Microsoft Word or an advanced tool for technical authors such as Adobe FrameMaker.

The challenge lies in the fact that these tools are proprietary products; the user’s ability to perform operations like translation is heavily dependent on the program’s built-in functionality. At the time of this project, the publisher needed to publish manuals in CJK scripts and right-to-left languages, but Adobe FrameMaker did not provide this facility. Therefore, an equipment user in Iraq or China would have access to only an English-language manual.

There are other challenges as well. For example, if content written in English is 100 pages long, the same content in another language may require up to 50% more space, which means the layout specialist would have to prepare the entire document from scratch. Another challenge was changing the mindset of the authors: given the variety of ways in which documents are accessed today, they had to accept that they cannot know how the final content will look in different output formats while they are working on it. To solve these problems, XML was brought into the picture.

How does XML address these challenges?

When the publisher uses an XML-first approach, they make a one-time entry for all their reusable content. For instance, the function of a diesel engine is the same, regardless of the drilling rig setup. The text that describes the use of the diesel engine remains uniform across the board.

However, the diesel engine may vary in size. So, every instance where the size of the diesel engine is referenced is identified and profiles are built around it. Every time a new diesel engine is ordered, the publisher only needs to create a profile of the engine with specifications and images, and the appropriate details are automatically entered into the manual.
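
Conceptually, this resembles conditional profiling in standards such as DITA. The sketch below, with hypothetical markup and invented specification values, shows shared descriptive text combined with engine data pulled from a per-rig profile at build time.

  # Hypothetical sketch of profile-based reuse: the descriptive text is written once, and
  # rig-specific engine data is pulled in from a profile at build time (similar in spirit
  # to conditional profiling in standards such as DITA).
  import xml.etree.ElementTree as ET

  TOPIC = """
  <topic id="diesel-engine">
    <para>The diesel engine drives the hydraulic power unit of the rig.</para>
    <spec name="power" value-ref="engine.power"/>
    <spec name="weight" value-ref="engine.weight"/>
  </topic>
  """

  # One profile per ordered configuration; the values are purely illustrative.
  PROFILES = {
      "rig-A": {"engine.power": "310 kW", "engine.weight": "1,450 kg"},
      "rig-B": {"engine.power": "450 kW", "engine.weight": "2,100 kg"},
  }

  def render(topic_xml, rig):
      topic = ET.fromstring(topic_xml)
      lines = [p.text for p in topic.iter("para")]
      lines += [f"{s.get('name')}: {PROFILES[rig][s.get('value-ref')]}"
                for s in topic.iter("spec")]
      return "\n".join(lines)

  print(render(TOPIC, "rig-A"))
  print(render(TOPIC, "rig-B"))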

Another benefit of the XML approach was that layouts could be built as templates: once the text was finalized, it was automatically flowed into the manual, with page breaks and layouts handled according to coded specifications.
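
As a toy illustration of layout-by-rule rather than layout-by-hand (the publisher’s actual toolchain is not described here, and real systems typically rely on a composition engine such as XSL-FO), the sketch below flows finalized paragraphs into pages according to coded width and height rules.

  # Toy sketch of template-driven pagination: finalized text is flowed into pages
  # according to coded rules (characters per line, lines per page) instead of being
  # laid out by hand. Purely illustrative; a real pipeline would use a composition engine.
  import textwrap

  PAGE_WIDTH = 40    # characters per line
  PAGE_HEIGHT = 6    # lines per page (tiny, to keep the demo short)

  def paginate(paragraphs):
      lines = []
      for para in paragraphs:
          lines += textwrap.wrap(para, PAGE_WIDTH) + [""]   # blank line between paragraphs
      return [lines[i:i + PAGE_HEIGHT] for i in range(0, len(lines), PAGE_HEIGHT)]

  manual = [
      "The diesel engine drives the hydraulic power unit of the rig.",
      "Check the oil level before every start and log the reading in the maintenance sheet.",
  ]

  for number, page in enumerate(paginate(manual), start=1):
      print(f"--- Page {number} ---")
      print("\n".join(page))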

Does this solution take away the artistic touch?

When this transition occurred, copyeditors and layout specialists expressed concern because some page breaks and image positions were laid out differently than they would have preferred.

From the author to the page designer, each person has a unique style with which they craft the manual. They had established a workflow when they were working on each page individually with FrameMaker. But when XML was brought in, they no longer had to work on pages individually. With the XML solution, the publishing team now had the opportunity to focus on improving the quality of language, images, and diagrams instead of focusing on the layout.

Therefore, the solution that enabled the publisher to accelerate their workflows also allowed them to spend more time on improving the content.

What was the business outcome?

The drilling rig manufacturer now has very happy clients who have access to a manual that is highly suited to their specific customization.

The publishing team of authors and designers that previously published around a dozen manuals annually now publishes over 300 manuals a year in 17-18 languages, including CJK scripts and right-to-left languages, using the same style sheets and content. The quality of the content has also improved, as the team no longer needs to re-lay out every version of the manual. The team is now able to publish the drilling rig manuals in different formats, such as print PDFs, eBooks, and HTML for the web, as well as mobile- and tablet-friendly versions.
