
Microsoft's AI-Powered Copilot Inside MS Word

Yesterday, while I was drafting a new academic writing workflow, Microsoft broke the news that it was working on integrating an AI-powered Copilot with MS Office. Think of it as having ChatGPT inside MS Word.

This is a huge development with serious implications for the future of academic writing. It’s both urgent and important that we start a discussion about what this integration is going to mean for us as academics.

That’s why I am going to talk a bit about it today. The new writing workflow will have to wait until next week.

While Microsoft plans to integrate its Copilot with all the apps available in MS Office, what concerns us is its integration with MS Word, an application used all over the world by academics for writing journal articles, dissertations, and many other forms of scholarship.

Unlike expensive Apple products, which are mostly used in high-income countries, Microsoft's software is used in both high- and low-income countries, where these apps are often made available through (undetected) piracy. A student in rural Pakistan may never use an Apple product in their life, but they will use MS Word at some point.

According to Jared Spataro, Corporate Vice President of Microsoft, “Copilot [in MS Word] gives you a first draft to edit and iterate on — saving hours in writing, sourcing, and editing time.” Spataro is cautious enough to point out that “sometimes Copilot will be right, other times usefully wrong — but it will always put you further ahead.”

What would it mean to have a first draft of a journal article or a dissertation chapter that we can tweak, revise, and rewrite? Will it be considered an AI-generated work or a human-generated one? Will it be considered an instance of plagiarism? There is an urgent need to discuss and debate these questions.

AI-powered apps like Jenni and Notion have already integrated versions of ChatGPT. If you type a word in Jenni, it will suggest not just the next word but a whole sentence based on that one word, then a sentence based on your previous sentence, and so on.

Notion, one of the most powerful and user-friendly writing apps, has an AI-powered writing assistant that you can use to brainstorm ideas and get started on a piece of writing.

The business (yes, it is a business in which publishers make millions of dollars) of academic writing is going to change dramatically right in front of our eyes.

A few days ago, Cambridge University Press (CUP) announced that “AI use must be declared and clearly explained in publications such as research papers, just as we expect scholars to do with other software, tools and methodologies.”

Note that the CUP did not state that authors are not allowed to use AI-powered apps. Authors are required to declare and explain the use of AI in their works. This shows a level of acceptance of AI-powered apps in academic publishing, although the terms of declaration and explanation will have to be constantly revised and updated.

My personal approach to AI-powered apps is to use them as my writing assistants. I have written about it several times on Twitter and LinkedIn. Yesterday, Times Higher Education published an article in which I talked about how I use AI-powered apps like ChatGPT.

You can read my comments below:

“There are smart ways to use ChatGPT and there are dumb ways”

As a Fulbright scholar living in upstate New York, Mushtaq Bilal used to talk to himself on his way to his daily swim. “As I drove to the pool, I’d have some good ideas about my next paper, so thought I should record and transcribe them,” recalls the Pakistani academic, now a postdoctoral scholar in world literature at the University of Southern Denmark. “That would give me a zero-draft, which I could work on.”

That method has, however, become much easier with ChatGPT, says Bilal, whose tips on how to use the AI tool for academic writing have been viewed millions of times on Twitter, pushing his following past 125,000. “You can tell it to remove redundant words, create coherent sentences and cohesive paragraphs,” he tells Times Higher Education. This method can compress a block of 13,000 words – the result of two hours of dictation – into a presentable 3,000-word essay, which, with another two hours of scholarly fine-tuning, is ready for submission.

Bilal rejects the suggestion that using AI is cheating. After all, the ideas are his own and the final draft will require many small edits to pass muster. “I’m a big believer in making notes by hand and I read very slowly, but I recognise that getting that first draft can be difficult,” he says.

To strike the right tone, Bilal imagines he is talking to a “very keen, intelligent first-year undergraduate”, even when instructing ChatGPT to improve text. “I talk to it politely…I say ‘please do this,’ even though it’s a machine. You need to have a positive relationship with it.”

With ChatGPT and other AI language tools here to stay, the challenge is how to use them intelligently and ethically, continues Bilal. “There are smart ways to use ChatGPT and there are dumb ways – one of which is asking it questions,” he says.

“Asking for answers is the wrong approach – you need to ask it to give you questions, so you can create your own answers,” he explains. For example, he recently asked ChatGPT to imagine 10 questions that might be asked in a Fulbright interview, reflecting on how he was initially rejected for the prestigious US scholarship before securing a PhD place at Binghamton University. “Of these 10 questions, I was asked eight,” he reflects.

Having these tools will help to level the playing field for those unacquainted with the unspoken rules that govern many parts of academia, says Bilal. “If you want to prepare an application for a US graduate school, there are things like a statement of purpose that I had no knowledge of,” he explains. “It took me 12 to 18 months to find the information and fill in an application, but you could easily ask ChatGPT to write a statement for you. It would be foolish just to submit this because it would have no personality, but it would give you the right structure on which you could build.”

From writing considerate rejection letters to punchier Twitter threads (he recommends WordTune), Bilal has found numerous ways to employ AI in academic life. While he acknowledges it might not suit everyone, academics should be open to its benefits: “If you are a willing learner – as most researchers are – why wouldn’t you want to learn this craft?”

Jack Grove’s interview of Mushtaq Bilal published in yesterday’s Times Higher Education

That’s it for this week.

I’ll see you next week, hopefully with a new writing workflow.

Until then, keep writing.
