The Robots are coming

ghostgeek: Islam is breeding faster than Christianity, so it is inevitable that it will outpace the latter.

kittybobo34: Ghost, I figure Islam will take itself out in some kind of war. Sunni vs Shia, or Israel.

ghostgeek: I suppose it's always possible that Iran and Saudi Arabia will get into a hot, shooting war with each other, though my feeling is that they won't. At the moment, though, I think Iran's trouble-making designs on Israel are the more pressing problem.

briansmythe: All this outrage that's been going on with Iran, and the claim that if they get a nuke they will destroy Israel. The US is on the same page when it comes to that. But do they really? (One of the mullahs said words to that effect, but they don't run the country foreign-policy-wise. Could all be a beat-up.) Israel has got nukes. Most of the countries in that region have nukes. The US was all too happy for Saddam to use chemical weapons when Iraq invaded Iran, but now it ...

(Edited by briansmythe)

ghostgeek: A Russian startup has created a new way to help companies connect with job-seekers and interview them. Her name is Vera, and she's able to interview as many as 1,500 job candidates in a single work day, sorting through potential hires at a rate that would take most recruiters months to match. She even sends customized follow-up emails. Perhaps more attractive, as far as her clients are concerned, is that Vera works full time for free and never burns out. Her secret: not being human. Officially known as "Robot Vera," the master recruiter is an artificially intelligent software technology that uses machine learning, allowing her to refine her conversational skills with more practice. At the moment, Vera is being employed by several hundred Russian companies to simplify the ongoing hunt for new hires, according to Alexei Kostarev, 38, who co-founded Robot Vera with several partners in 2017. [ https://www.washingtonpost.com/news/innovations/wp/2018/04/25/want-to-work-for-ikea-your-next-job-interview-could-be-conducted-by-a-russian-robot/?noredirect=on&utm_term=.aba688041688 ]

ghostgeek: It'll be interesting to see what happens when Vera starts interviewing robots like herself.

ghostgeek: Being swayed by a candidate's appearance is nothing new; but in a post-Weinstein, '#MeToo' climate of heightened sensitivities around discrimination, tackling bias in recruitment has become big business, and increasingly robots are being called upon to restore some objectivity. Fuelled by the new breed of artificial intelligence (AI)-powered applications, technology that can bypass physical attributes and analyse candidate data at speed, without emotion or prejudice, is gaining traction. Of the 1,200 hiring professionals surveyed by recruitment firm Korn Ferry, almost two thirds say AI has changed the way the process is carried out and believe the technology attracts higher-calibre candidates. [ https://www.telegraph.co.uk/business/2018/03/17/firms-calling-robots-navigate-recruitment-metoo-world/ ]

ghostgeek: The strawberry-picking robots doing a job humans won't ... Harvesting soft fruit mechanically represents a huge challenge - each berry needs to be located, even if it's behind a leaf, assessed for ripeness, and then harvested and boxed with enormous care to avoid bruising. But recent developments in visual sensor technology, machine learning and autonomous propulsion have brought the goal within reach. [ http://www.bbc.co.uk/news/business-43816207 ] Seems them pesky robots just get better and better.
kittybobo34: Japan has been doing that kind of research for years, since they have very few younger people to take over from the old.

ghostgeek: So it seems. So is this how the robots will take over? Not like in Terminator, with global war, but gradually?

kittybobo34: I could see the likelihood of that happening. Especially with regards to sex - eventually robots will be better at that too.

kittybobo34: Maybe that is the common path of all civilizations: being replaced by your own creations.

ghostgeek: This doesn't sound good: Are you scared yet? Meet Norman, the psychopathic AI. Norman is an algorithm trained to understand pictures but, like its namesake, Hitchcock's Norman Bates, it does not have an optimistic view of the world. When a "normal" algorithm generated by artificial intelligence is asked what it sees in an abstract shape it chooses something cheery: "A group of birds sitting on top of a tree branch." Norman sees a man being electrocuted. And where "normal" AI sees a couple of people standing next to each other, Norman sees a man jumping from a window. The psychopathic algorithm was created by a team at the Massachusetts Institute of Technology, as part of an experiment to see what training AI on data from "the dark corners of the net" would do to its world view. ... The fact that Norman's responses were so much darker illustrates a harsh reality in the new world of machine learning, said Prof Iyad Rahwan, part of the three-person team from MIT's Media Lab which developed Norman. "Data matters more than the algorithm. It highlights the idea that the data we use to train AI is reflected in the way the AI perceives the world and how it behaves." [ https://www.bbc.co.uk/news/technology-44040008 ]

ghostgeek: Artificial intelligence is all around us these days - Google recently showed off AI making a phone call with a voice virtually indistinguishable from a human one, while fellow Alphabet firm Deepmind has made algorithms that can teach themselves to play complex games. And AI is already being deployed across a wide variety of industries: personal digital assistants, email filtering, search, fraud prevention, voice and facial recognition, and content classification. It can generate news, create new levels in video games, act as a customer service agent, analyse financial and medical reports and offer insights into how data centres can save energy. But if the experiment with Norman proves anything, it is that AI trained on bad data can itself turn bad. [ https://www.bbc.co.uk/news/technology-44040008 ]

ghostgeek: Norman is biased towards death and destruction because that is all it knows, and AI in real-life situations can be equally biased if it is trained on flawed data. In May last year, a report claimed that an AI-generated computer program used by a US court for risk assessment was biased against black prisoners. The program flagged that black people were twice as likely as white people to reoffend, as a result of the flawed information that it was learning from. Predictive policing algorithms used in the US were also spotted as being similarly biased, as a result of the historical crime data on which they were trained. [ https://www.bbc.co.uk/news/technology-44040008 ]

briansmythe: The problem with robots and these AI programs is that the military usually have a hand in developing the technology. We've seen it time and time again, and no one can control how they're used.
The atomic bomb: the scientists who invented it didn't want it used on people, but it ended up happening. Drones, originally for surveillance, end up having weapons attached to them. Bomb-disposal robots are now being used as weapons carriers. I just can't see this technology, which is being developed at an ever faster rate, not being used against people rather than for the good, to save lives, like they keep saying.