“It’s as though the creative process is no longer contained within an individual skull, if indeed it ever was. Everything today is to some extent the reflection of something else.” William Gibson, ‘Pattern Recognition’ (2003)

We are now fairly used to AI. There is nothing very unexpected, then, about intelligent systems that write sports reports or business news, or fully synthesised AI voiceovers in advertisements, which may also be created entirely by CGI. Creating new Beatles-alike pop songs is likewise six years in the past for Flow Machines at the Sony CSL Lab. Even replacing the missing pieces of Rembrandt’s ‘Night Watch’, or creating a wholly new ‘Rembrandt’, seems not entirely unusual. Indeed, with GitHub’s Copilot we even have a picture of the machine sitting at our elbow, writing the code that we were thinking about writing: a vision of some future scenario of autonomous machine creativity. Given enough data, enough machine intelligence and enough machine learning capacity, we believe that anything is possible. So why, in terms of our laws and regulations, do we fail to register the capacity of intelligent machines to generate original and creative work which cannot belong entirely to the owners of the machines, or the writers of the programs, or the owners of the data?
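For a concrete sense of that machine at the elbow, here is a purely illustrative sketch in Python: the developer types only the comment and the signature, and a Copilot-style assistant volunteers the rest. The function and its completion are invented for this example, not taken from any real Copilot session.

```python
# Purely illustrative: the human supplies the comment and the signature;
# a Copilot-style assistant proposes the body. Invented for this sketch.

# Return the median of a non-empty list of numbers.
def median(values):
    ordered = sorted(values)                  # suggested: sort first
    mid = len(ordered) // 2
    if len(ordered) % 2:                      # suggested: odd/even cases
        return ordered[mid]
    return (ordered[mid - 1] + ordered[mid]) / 2

print(median([3, 1, 2]))        # 2
print(median([4, 1, 3, 2]))     # 2.5
```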

A very informative seminar organised by IBIL at University College London tackled the issues surrounding copyright and AI last week. Excellent speakers from the UK Intellectual Property Office and from the music rights licensing body laid out the current steps to review the law, and the conservative view of protecting the livelihoods of living creators. All of this is clear and understandable. We know that the law will always be five years behind the front line of innovation, and we certainly want the rights of the living creators of original works to be protected. But there is also a real worry here that we will delay or inhibit development work vital to the growth of a strong AI-based sector of the economy, and within it the creativity and originality that should be associated with it. Tobias McKenney of Google spoke graphically of the iterative processes of modelling, adjusting and remodelling, and wondered whether current provision protected the model, the process, or the output. He also pointed out the need to regulate against bias and the use of selective data, through audit and certification, and for the protection of AI creativity to be global. Martin Senftleben, from the University of Amsterdam, proved a fascinatingly different professor of IP in that he argued that the objectives of our society were more important than the legal objectives, and saw copyright as something aimed at restricting acts rather than encouraging them. So perhaps AI creativity did need a boost, and perhaps a new neighbouring right, a ‘single equitable payment’, was needed to secure the data availability that would in turn stimulate development.
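McKenney’s three-way question becomes easier to see with a toy example. The sketch below, fitting a straight line by gradient descent in plain Python, is invented for illustration: it simply makes concrete which artefact a copyright regime might attach to — the fitted parameters (the model), the training loop (the process), or what the model then generates (the output).

```python
# A minimal sketch of the modelling / adjusting / remodelling loop.
# Invented example: fit y = a*x + b by gradient descent on squared error.

def fit_line(points, steps=5000, lr=0.01):
    """Fit y = a*x + b to (x, y) pairs by gradient descent."""
    a, b = 0.0, 0.0                               # the "model" is just (a, b)
    n = len(points)
    for _ in range(steps):                        # the "process": iterate...
        grad_a = sum(2 * x * (a * x + b - y) for x, y in points) / n
        grad_b = sum(2 * (a * x + b - y) for x, y in points) / n
        a -= lr * grad_a                          # ...adjust...
        b -= lr * grad_b                          # ...and remodel
    return a, b

a, b = fit_line([(0, 1), (1, 3), (2, 5)])         # the data shapes the model
print(round(a, 2), round(b, 2))                   # ~ 2.0 1.0
print(a * 10 + b)                                 # the "output": a prediction, ~ 21.0
```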

The quotation at the head of this piece reminds us that all art, and science, is derivative in some sense. Not only do we stand on “the shoulders of giants”, but we look through their eyes and make use of their brains as well. Perhaps it would be better to shelve the debate on whether a machine can be original and creative, and concentrate on ensuring that the data is available and licensable to make AI the effective boon to our society that it certainly can be. This means arguing for something like a re-use right which makes data holders explain why their data cannot be used, neighbouring rights around standard terms and payments to licensing societies on data re-use, and standard core terms for data and text mining licences. Let’s get this up the agenda, and leave the arguments about ownership, creativity and originality until later, until we are prepared to debate whether a machine can have a legal personality – or, indeed, a simulacrum of consciousness.

REFERENCES

Seminar:

https://www.ucl.ac.uk/laws/events/2021/jul/ai-and-copyright-what-next


Operation Night Watch, Rijksmuseum: https://www.rijksmuseum.nl/en/stories/operation-night-watch/story/night-watch-the-missing-pieces

The Next Rembrandt: https://www.nextrembrandt.com

AI voice synthesis: www.WellSaidLabs.com and www.VocaliD.ai

‘Daddy’s Car’, Flow Machines, Sony CSL: https://www.youtube.com/watch?v=LSHZ_b05W7o

NLG data narrative platforms: www.narrativescience.com; www.automatedinsights.com; www.yseop.com; www.primer.ai; www.arria.com

GitHub Copilot: https://copilot.github.com

I have been accused of being a Techno Utopian. And, on reflection, I am delighted. In the first place, it makes me sound as if I know a great deal more about technology than I actually do. In my business that cannot be a bad thing. Then again, it labels me as an optimist. And I am an optimist. I do not go as far as the late Dr Pangloss in an unreasonable belief that all is for the best in the best of all possible worlds – no reader of this column could accuse me of that – but I do believe that it is within the span of man’s accomplishable ambitions to improve the quality and impact of scholarly communication in science, and in particular to address worrying issues of reproducibility, data availability and metadata dissemination. Thus I believe that Open Access is here to stay, and that Open Science will open up the way to further fundamental changes in the way science and its findings are reported and digested within the scholarly community. My accuser levelled this techno-utopianism accusation at me in response to a comment I made on LinkedIn. I am told that he has developed the charge in his own subscription newsletter. Sadly, I have not been able to read this, since he threw me off his subscription list some years ago!

What I am called is fun, and funny, of course, but far less interesting than what happens in this critical marketplace. Nothing should distract us at the moment from looking carefully at what has happened during the pandemic. In particular, we should perhaps be looking at the way in which the pre-print movement has developed in the last 18 months. From the reports that I have seen, activity levels have been high, predictably in the life sciences, but also elsewhere. Getting work “out there” seems to have been an understandable priority for many researchers in uncertain times. It did not lessen a concern to be ultimately published in high-brand, strong impact factor journals. We can connect that continuing wish to what many see – though there are no numbers to guide us – as a growth in the frequency of “pre-validated” articles going onto pre-print servers. “Pre-validated” could mean pre-checked for plagiarism, or to correct references or citations, or to improve standards of English expression – the whole variety of checking mechanisms deployed by service support companies like Cactus Global or ResearchSquare. Applied at scale, these checks will improve the quality of manuscript submissions. Some publishers will see this as a cost reduction advantage: others will see a risk to the ultimate control of process by publishers. If you threaten that, they will say, you risk the one thing that keeps the publisher in the game – the certification of the research process by virtue of control of the version of record.

Besides increasing activity on pre-print servers, something else has been happening during the pandemic. More and more so-called “transformational agreements” between publishers and institutions have been signed, and a variety of terms exhibited. However, we are yet to get into a renewal cycle, and it might be wise to speculate a little on what the learning experience may be for the not-unsophisticated negotiators on the university and institutional side of the table. The deals and the process may predictably move the power to decide where publication takes place out of the hands of individual lead researchers and place it in the deal-making bundle. Institutional negotiators will be no less keen than individual researchers on creating acknowledgement of worth and enhancing reputation. But an institutional view might be a percentage view. Imagine a negotiator saying: “We estimate that 60% of the research we produce should reach your highest impact factor journals – but if we are wrong, give us a compensatory discount greater than the savings on APCs.” Or even: “Our researchers now expect a full value pre-publication manuscript preparation and vetting service, including exposure on a pre-print server while decisions are made. While you have first publication options, they will be time limited, so that other publishers can adopt pre-prints from your servers when options to publish are exhausted.”
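To make the arithmetic of that first imagined demand concrete, here is a small sketch. Every figure in it – the 60% target share, the APC price, the 1.5 multiplier – is invented for illustration, not drawn from any real agreement.

```python
# A hedged sketch of the imagined negotiator's arithmetic above. The target
# share, APC figure and multiplier are all invented for illustration only.

def compensatory_discount(papers, top_tier_accepted,
                          target_share=0.60, apc=2500.0, multiplier=1.5):
    """If fewer than target_share of papers reach the top-tier journals,
    claim a discount greater than the APC savings on the shortfall."""
    shortfall = max(0, round(papers * target_share) - top_tier_accepted)
    apc_savings = shortfall * apc        # what was "saved" in unpaid APCs
    return apc_savings * multiplier      # demand strictly more than that

# 100 papers, 60 expected in top-tier journals, only 45 accepted:
print(compensatory_discount(100, 45))    # 56250.0
```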

These future negotiations will become very complex. Some institutions, especially in medical research, will probably evolve single-publisher relationships, which may make renewal negotiations very difficult at times. Others may decide that it is easier to work with two or three larger players who have a range of journals at various levels of impact that fit their requirements. This not only further squeezes the societies, many of whom will predictably follow the IET-Wiley route, but will also impact the small specialist publisher. And when the negotiations begin to include the wider requirements of Open Science – pre-registration, for example – the pressure on publishers to shoulder a wider part of the process and its contingent costs will intensify. The winners, whatever their scale, will be the full-service players who, aided by ever-improving technology, take on the widest responsibility for taking scholarly communication from the research bench to its definable, mapped and aware readership. These will be difficult days for those who want publishing to retreat to its traditional bounds.

Now, Reader! Was that too “techno” for even the most delicate of digestions? Or too Utopian for even the most pessimistic? Be my judge!
