The Evolving MSP


Partners, AI and Plagiarism

We have all been here before. When we first saw "VisiCalc -- The Visible Calculator," it was revolutionary: a ledger page on the computer screen. However, most early users weren't exactly sure what it could be used for. When Intel introduced its "TeamStation," we gained the remarkable ability to see and speak with each other from our computers. Amazing, but, again, what would we use it for?

Fast-forward to when Microsoft announced it was "all-in" on the cloud. Many customers could share one remote server -- totally revolutionary. However, early adopters feared security breaches. Non-adopters refused to let their data reside anywhere other than "inside our own four walls."

This past year, partners have been inundated with news about, and a mandate to adopt, generative AI and machine learning. The word of the year was ChatGPT. We've seen early adopters generate e-mails, letters, even whole reports and other documents using it. It's been made to create music, images and even an endless "Seinfeld" episode. We've even read about how an AI "came on to" a reporter, encouraging them to leave their spouse and enter into a romantic relationship with it.

But who really needs any of that? Once again, we are confronted with a revolutionary technology in search of practical application. Experience has taught us that those applications will eventually reveal themselves to us. In the weeks and months to come, much of this blog will examine how MSPs can leverage these new cognitive technologies to provide useful and valuable applications for their clients. There's plenty of technical information about the underlying magic, but we're going to focus on how MSPs, CSPs, SIs and other information technology service providers (ITSPs) can turn that magic into customer value.

Addressing AI Plagiarism in Your Role as Trusted Technology Advisor
To my recollection, it was HP that first used the phrase "trusted technology advisor" to describe to its partners what they should become. An online search returns literally millions of results from people who consider themselves to be "trusted technology advisors." I bring this up because of a fairly major wart that has grown on the surface of generative AI: plagiarism.

Actors and writers recently went on an extended strike over, in part, concern about generative AI. Actors were concerned that a generative model built from a few brief recordings of them could then be used to produce whole performances. Writers were concerned that machine learning engines were being "trained" on content scraped from the Internet. In other words, the technology was plagiarizing them -- or at least paraphrasing them to a great extent. It was also feared that it might be impossible to tie the generated text back to the original sources.

These fears are well-founded. The University of Mississippi's Ole Miss newsletter published an article in February 2023 titled "Can Artificial Intelligence Plagiarize" that synthesizes much of the available reporting. In the article, writer Erin Garrett lists three separate criteria that researchers commonly use to test for plagiarism: "direct copying of content, paraphrasing and copying ideas from text without proper attribution." According to Garrett, "[Researchers] found evidence of all three types of plagiarism in the language models they tested. Their paper explains that GPT-2 can 'exploit and reuse words, sentences and even core ideas in the generated texts.'"

Your customers will be learning more and more about these plagiarism issues. The writer and actor strikes were very public and covered extensively in the news. The more generative AI technologies are explained to customers, the more concerned they will become. As you begin to propose pragmatic applications of generative AI, customers will very likely question how they can protect themselves from being sued for misusing copyrighted content.

You have two options. The first is to recommend that they consult their legal counsel (the last thing you want to do is dispense legal advice that you're not qualified to provide). The second is to carefully track the anti-plagiarism tools that will inevitably be developed for just this purpose -- quite a few are, in fact, already on the market. Just as you sell and implement data and network security systems, you will be building more business around protecting generative AI applications against committing plagiarism.
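To picture the simplest of the three criteria Garrett describes -- direct copying of content -- here is a minimal, purely hypothetical Python sketch of an n-gram overlap check. It is not any vendor's product or a real anti-plagiarism engine; commercial tools also tackle paraphrasing and idea-level reuse at far greater scale. It simply illustrates the kind of comparison these products automate.

```python
# Naive illustration only: flag how many 5-word phrases in generated text
# also appear verbatim in a known source document.

def ngrams(text: str, n: int = 5) -> set:
    """Return the set of n-word phrases in the text (lowercased)."""
    words = text.lower().split()
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

def overlap_score(generated: str, source: str, n: int = 5) -> float:
    """Fraction of the generated text's n-grams that match the source."""
    gen, src = ngrams(generated, n), ngrams(source, n)
    if not gen:
        return 0.0
    return len(gen & src) / len(gen)

if __name__ == "__main__":
    generated = "the quick brown fox jumps over the lazy dog near the river"
    source = "a quick brown fox jumps over the lazy dog and runs away"
    score = overlap_score(generated, source)
    if score > 0.2:  # arbitrary threshold for this illustration
        print(f"Warning: {score:.0%} of 5-word phrases match the source.")
```

Again, this is only a thumbnail of the "direct copying" test; detecting paraphrase and idea reuse requires the more sophisticated techniques the researchers describe.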

As you study the emerging AI engines, be sure to also survey the anti-plagiarism and other content-protection tools, utilities and systems as they become available. Your customers will thank you and your bottom line will grow with assured protection against legal entanglement.

Posted by Howard M. Cohen on January 12, 2024

