We are resellers for various academic publishers. One of the things we do is send out email campaigns to our lists of researchers and librarians. All very boring. One of the main roles for the marketing assistant was writing the copy for the emails: "Here are 10 reasons why our new AI eBook collection is so much better than the last eBook collection I sold you a month ago" type of thing. Generative AI is pretty good at "writing" that sort of thing with minimal editing needed. However, it's a very simple task it's doing, basically summarising marketing material and publisher websites into snappy emails. What we have now with Generative AI is a long way from artificial general intelligence (AGI) that could match or surpass human capabilities across all cognitive tasks. As it happens, we sell a **** ton of books on AI, all written by humans... so far.
Shouldn't get totally carried away with AI being the future, but we can at least rely on it for the simple stuff ...
Pretty standard and equivalent to new graduates nowadays, ever since universities became businesses and not centres of excellence https://www.instagram.com/reel/DMf0cKQN2VX/
I’ve been using my company's co-pilot as a research assistant and it’s been doing a great job, basically better than Google search, as it will summarise and tabulate results. I find that as long as you get it to cite sources, so you can check that it is reasonably accurate, it’s great. Then I fed it a lengthy PDF document and asked it to extract a certain table, as I needed to use it in a Word document and update it. Total disaster, and I got my first example of AI hallucinating, where it basically made stuff up. I spent so long arguing with it to try and get it to do the job that I could have recreated the table myself from scratch. So I’m left with using it as a better search engine, for improving written English, and for basic summarisation. I’ve no doubt it will improve.

I do worry, though, that I started off my career doing all the basic research for more experienced people: producing tables of information, filling in forms and summarising. If we use AI to do this, then how are young people going to get the experience they need to end up in my job in 30 years' time? If I’m honest, I think my job will probably be obsolete in 5-10 years, which is OK by me as I’ll be looking at retirement.
I had a Zoom meeting today to discuss new AI options that could be added to my website, and there's some clever stuff going on. Most websites have a search bar to find things on a site; there's a new AI one that works exactly the same, but it records everything a particular customer searches for. The new system also records all the information about the products a customer looks at, how long a customer looks at a specific product, and it can even send an automated email specific to that customer, referring to the item they looked at for the longest and offering a discount etc., if that's what you want. The volume of data these things collect is incredible; you can see an enormous amount of information on what is and isn't working. If I had time to work out how to use it all, it would probably be brilliant.
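For what it's worth, a rough sketch of the kind of tracking being described. Everything here is hypothetical (the field names, the dwell-time idea, the 10% discount), just to illustrate the flow of "record views, find the longest-viewed product, queue a personalised email", not any particular vendor's product:

```typescript
// Hypothetical event shapes for the tracking described above.
interface SearchEvent {
  customerId: string;
  query: string;
  searchedAt: Date;
}

interface ViewEvent {
  customerId: string;
  productId: string;
  dwellMs: number; // how long the customer stayed on the product page
  viewedAt: Date;
}

// Pick the product the customer looked at for the longest.
function longestViewedProduct(views: ViewEvent[]): ViewEvent | undefined {
  return views.reduce<ViewEvent | undefined>(
    (best, v) => (best === undefined || v.dwellMs > best.dwellMs ? v : best),
    undefined,
  );
}

// After a browsing session ends, build one personalised follow-up email
// referring to the most-viewed item, optionally with a discount.
function buildFollowUpEmail(customerId: string, views: ViewEvent[]) {
  const top = longestViewedProduct(views);
  if (!top) return undefined;
  return {
    to: customerId,
    subject: "Still thinking about it?",
    body:
      `We noticed you spent a while looking at product ${top.productId}. ` +
      `Here's 10% off if you'd like to come back to it.`,
  };
}
```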
I can see how that might drive a customer's buying decision, which is great for you, but it would make me feel a little uncomfortable, as though I should be looking over my shoulder all the time. Having said that, I'd imagine these facilities are introduced by stealth, so they might already be influencing me and I haven't really noticed! I've been looking at new golf club sets these past few days and I've not had any targeted communication yet, and certainly no discount offers!
Strictly speaking, a lot of what is discussed is more machine learning than it is AI. The size and speed of computers have enabled this, but most of it is a long way from actual 'intelligence', as these systems only operate within the parameters they are set.
Yes. What OLM is describing is CRM (Customer Relationship Management) which was a big thing back when I worked in IT in the nineties. CRM became a big thing back then not because someone had just invented it, but because the cost of processors and storage was going down at the same time the power/speed/capacity was going up, plus the advent of parallel processing. A lot of what is described as AI now is nothing particularly new, but it's just possible now because of advances in chip technology and solid state storage. Jesus, I even bored myself there.
Whenever I know a lot about a subject and use AI, I am usually appalled by the results. Even if it doesn't get anything wrong, I don't usually like the nuances. It makes me scared to use AI unless I am willing to do a lot of editing. I usually prefer to learn about a subject and add to my knowledge rather than hope for the best. I accept that some people can think of ways to use AI when there is no way they have the time to do it themselves.
The whole meaning of the term AI has become very loose and poorly defined. It's been a term for decades, but since ChatGPT blew up it's been used very differently. Even within software companies (I work for one) it's used as a buzzword without people really knowing what they mean by it or what they want to achieve. People are just aware there's this powerful thing taking off and they want to be involved with it, without really understanding it or being able to say why they want to be involved with it.
We are a fair way from actual artificial intelligence. I would like to see it one day, preferably with robots with artificial skin. Wait... where have I seen this before?
I believe that will cause a big decline in critical thinking skills, amongst other things. There are some very interesting studies on it.
You see that on here sometimes. People posting that they asked ChatGPT about the intricacies of the EFL appeal process. If it's not publicly available then clearly an LLM isn't going to be able to tell you anything useful. But I guess some people think it just somehow knows everything.