The problem throughout is that you are trying to define the solution without defining the problem. My artificial horizon is an adequate virtual universe if the problem is to keep the plane flying straight and level with no visual reference. The GPS moving map adds just enough data if I want to get somewhere, and the ILS gives me a virtual beeline to the runway threshold. Each of these solutions began with a clear statement of the problem.
This thread is another spinoff from my earlier thread called universal utopia. This time I try to attack the problem from another angle: the information-theory point of view.
Learn from the mistakes of others; you can never live long enough to make them all yourself. — Groucho Marx
Meet GPT-3. It Has Learned To Code (and Blog and Argue). The latest natural-language system generates tweets, pens poetry, summarizes emails, answers trivia questions, translates languages and even writes its own computer programs.
This summer, an artificial intelligence lab in San Francisco called OpenAI unveiled a technology several months in the making. This new system, GPT-3, had spent those months learning the ins and outs of natural language by analyzing thousands of digital books, the length and breadth of Wikipedia, and nearly a trillion words posted to blogs, social media and the rest of the internet.

Mckay Wrigley, a 23-year-old computer programmer from Salt Lake City, was one of the few invited to tinker with the system, which uses everything it has learned from that vast sea of digital text to generate new language on its own. Mr. Wrigley wondered if it could imitate public figures — write like them, perhaps even chat like them.
One of his experiments involved a pop psychologist, Scott Barry Kaufman. The system took in Mr. Kaufman’s name and a topic for discussion: creativity. Then, when asked “How do we become more creative?” GPT-3 responded instantly:

“I think creative expression is a natural byproduct of growing up in a diverse world. The more diverse the world is, the more you get exposed to different people, to different opportunities, to different places and to different challenges. And the more diverse that is, the more likely you’ll be to be able to put the dots together to form something new. And in many ways, I think if you want to be creative, you have to go for it. If you want to be a writer, you have to write, if you want to be a musician, you have to create music, if you want to be a comedian, you have to create comedy, and the more you create, the more likely it is that you’ll stumble onto some interesting stuff.”

Later, when Mr. Wrigley posted the paragraph on Twitter, somebody looped in the real Scott Barry Kaufman. He was stunned. “It definitely sounds like something I would say,” the real Mr. Kaufman tweeted, later adding, “Crazy accurate A.I.”
Objective reality contains a lot of objects with complex relationships among them. Hence, to build a virtual universe we must use a method capable of storing data that represents such a complex system. The obvious choice is a graph: a mathematical structure used to model pairwise relations between objects. A graph in this context is made up of vertices (also called nodes or points) which are connected by edges (also called links or lines).
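As a concrete illustration, here is a minimal sketch of such a graph as an adjacency list in Python; the example entities and relations are invented for illustration:

```python
# A graph stored as an adjacency list: each vertex (node) maps to the
# set of vertices it shares an edge (link) with.

from collections import defaultdict

class Graph:
    """Undirected graph stored as an adjacency list."""

    def __init__(self):
        self.adj = defaultdict(set)

    def add_edge(self, u, v):
        # An edge links two vertices; store it in both directions.
        self.adj[u].add(v)
        self.adj[v].add(u)

    def neighbors(self, v):
        return self.adj[v]

# Representing a few objects and the relationships among them:
g = Graph()
g.add_edge("Paris", "France")
g.add_edge("France", "Europe")
g.add_edge("Paris", "Seine")

print(sorted(g.neighbors("Paris")))  # ['France', 'Seine']
```

Knowledge graphs extend this basic structure by labeling the edges with the kind of relationship they represent (for example "is-capital-of" rather than a bare link).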
Rebooting AI: Deep learning, meet knowledge graphs

Gary Marcus, a prominent figure in AI, is on a mission to instill a breath of fresh air into a discipline he sees as in danger of stagnating. Knowledge graphs, the 20-year-old hype, may have something to offer there.

"This is what we need to do. It's not popular right now, but this is why the stuff that is popular isn't working." That's a gross oversimplification of what scientist, best-selling author, and entrepreneur Gary Marcus has been saying for a number of years now, but at least it's one made by himself. The "popular stuff which is not working" part refers to deep learning, and the "what we need to do" part refers to a more holistic approach to AI. Marcus is not short of ambition; he is set on nothing less than rebooting AI. He is not short of qualifications either. He has been working on figuring out the nature of intelligence, artificial or otherwise, more or less since his childhood.

Questioning deep learning may sound controversial, considering deep learning is seen as the most successful sub-domain in AI at the moment. Marcus, for his part, has been consistent in his critique. He has published work that highlights how deep learning fails, exemplified by language models such as GPT-2, Meena, and GPT-3.
Deep learning, meet knowledge graphs

When asked if he thinks knowledge graphs can have a role in the hybrid approach he advocates for, Marcus was positive. One way to think about it, he said, is that there is an enormous amount of knowledge represented on the Internet that's available essentially for free, and is not being leveraged by current AI systems. However, much of that knowledge is problematic:

"Most of the world's knowledge is imperfect in some way or another. But there's an enormous amount of knowledge that, say, a bright 10-year-old can just pick up for free, and we should have RDF be able to do that.

Some examples are, first of all, Wikipedia, which says so much about how the world works. And if you have the kind of brain that a human does, you can read it and learn a lot from it. If you're a deep learning system, you can't get anything out of that at all, or hardly anything.

Wikipedia is the stuff that's on the front of the house. On the back of the house are things like the semantic web that label web pages for other machines to use. There's all kinds of knowledge there, too. It's also being left on the floor by current approaches.

The kinds of computers that we are dreaming of that can help us to, for example, put together medical literature or develop new technologies are going to have to be able to read that stuff. We're going to have to get to AI systems that can use the collective human knowledge that's expressed in language form and not just as a spreadsheet in order to really advance, in order to make the most sophisticated systems."
A fascinating hypothesis is that human and animal intelligence could be explained by a few principles (rather than an encyclopedic list of heuristics). If that hypothesis were correct, we could more easily both understand our own intelligence and build intelligent machines. Just like in physics, the principles themselves would not be sufficient to predict the behavior of complex systems like brains, and substantial computation might be needed to simulate human-like intelligence. This hypothesis would suggest that studying the kind of inductive biases that humans and animals exploit could help both clarify these principles and provide inspiration for AI research and neuroscience theories. Deep learning already exploits several key inductive biases, and this work considers a larger list, focusing on those which concern mostly higher-level and sequential conscious processing. The objective of clarifying these particular principles is that they could potentially help us build AI systems benefiting from humans' abilities in terms of flexible out-of-distribution and systematic generalization, which is currently an area where a large gap exists between state-of-the-art machine learning and human intelligence.
Just a minor quibble: why do correspondents write 1021 when they mean 10 to the power of 21? There is a perfectly good abbreviation to indicate that you mean "to the power" on all the keyboards that I have used ("^"), but maybe the articles are written on a pocket device that lacks it.
NEWSLETTER ON LINKEDIN: Artificial Intelligence (AI), by Bernard Marr

Future Trends And Technology – Insights from Ericsson

Innovation and new thought is what makes the world go round. Behind all the ground-breaking technologies such as AI and automation are human minds that are willing to push boundaries and think differently about solving problems, in both business and society. Investing in true innovation – how to use technology to do different things, as opposed to just doing things differently – has led to sweeping changes in how we communicate, work together, play and look after our health in recent years. In particular, it has allowed businesses and organizations to get closer to their most important asset – the people who use or consume their services – than ever before. This is thanks to the ever-smarter ways in which we are capturing data and using it to overcome challenges, from understanding customer behavior to creating vaccines.

I was fortunate enough to get the chance to talk to two people who are working at this cutting edge – Jasmeet Sethi and Cristina Pandrea, of Ericsson's ConsumerLab. This is the division within Ericsson responsible for research into current and emerging trends – with a specific focus on how they are being used in the real world today, and what that might mean for tomorrow. During our conversation, we touched on five key trends that have been identified by the ConsumerLab, which has been collecting and analyzing data on how people interact with technology for more than 20 years. One thing they all have in common is that every one of them has come into its own during the current global pandemic. This is usually for one of two reasons – either because necessity has driven a rapid increase in the pace of adoption, or because they provide a new approach to tackling problems society is currently facing.
Let's look at each of the five trends in turn.

1. Resilient networks

In 2020, more than ever before, we've been dependent on the stability and security of IT systems and networks to keep the world running. As well as the importance of uptime and core stability when it comes to allowing businesses to switch to work-from-home models, it's been shown that cyber attacks have increased dramatically during the pandemic, meaning security is more vital than ever before. Many of the international efforts to trace the spread of the disease, understand people's behavior in pandemic situations, and develop vaccines and cures are dependent on the transfer of huge volumes of digital data. Ericsson believes that the amount of data transferred has increased by 40% over mobile networks and 70% over wired broadband networks since the start of the pandemic. So ensuring that infrastructure is reliable and secure has never been so important. The fact that network operators have largely been successful at this hasn't gone unnoticed, Sethi tells me – with customers thanking them with a noticeably higher level of loyalty.

2. Tele-health

Medical consultation, check-ups, examinations, and even diagnoses were increasingly being carried out remotely, even pre-covid, particularly in remote regions or areas where there is a shortage of clinical staff. However, during 2019 they made up just 19% of US healthcare contacts. Ericsson's research has shown that this increased to around 46% during 2020. This is clearly an example of a trend where the pandemic accelerated a change that was already happening. So it's likely that providers will be keen to carry on receiving the benefits they've generated, as we eventually move into a post-covid world. Here a key challenge comes from the fact that a number of different technologies need to be working together in harmony to ensure patient care doesn't suffer, from video streaming to cloud application platforms and network security protocols.

3. Borderless workplaces

We saw the impossible happen in 2020 as thousands of organizations mobilized to make remote working possible for their workforces in a very short period of time. But this trend goes beyond "eternal WFH" and points to a future where we have greater flexibility and freedom over where we spend our working hours. Collaborative workplace tools like Zoom and Slack meant the switchover was often relatively hassle-free, and next-generation tools will cater for a future where employees can carry out their duties from anywhere, rather than just stuck at their kitchen tables. But this shift in social norms brings other problems, such as the danger of isolation, the difficulty of striking a balance between home and work life, or a diminished ability to build a culture within an organization. Solutions in this field look to tackle these challenges, too, rather than simply give us more ways to be connected to the office 24/7.

4. The Experience / Immersive Economy

Touching on issues raised by the previous trend, Ericsson has experimented with providing employees with virtual reality headsets, to make collaborative working more immersive. Pandrea described the benefits of this to me – "The experience was really genuine, it took us by surprise … we'd seen virtual reality before, but this was the first time where we saw 25 people in the same virtual room, having this experience … when you see the others as avatars you get the feeling of being together, it makes a world of difference."

This trend involves creating experiences that mean as little as possible is lost when you move an interaction or event from the real world to the virtual world. Virtual and augmented reality have an important role here, but Sethi points beyond this to an idea he calls the "internet of senses," where devices can feed information to us through all of our five senses.
Breakthrough technologies such as the Teslasuit use haptic feedback to greatly increase the feeling of presence in virtual spaces, and the suit is used by NASA to train astronauts. Other innovators in this field are working on including our sense of smell, by dispensing fragrances from headset attachments. Another interesting change related to this field that's been predicted is the rise in the value put on virtual commodities and status versus material goods. Children these days are just as likely to talk boastfully about a rare Fortnite skin, Rocket League car, or Roblox pet as they would about any physical product or status symbol. "If you look at young millionaires they're already driven by virtual status – who has the best status in esports, the number of followers … this trend will be accelerated as we move into the virtual experience economy", Sethi predicts.

5. Autonomous Commerce

Two massive changes to the way we live our lives due to the pandemic have been a big acceleration in the uptake of online retail, and a move away from cash towards contactless payment methods. Cashiers were already being replaced by self-checkouts at a rapid pace pre-2020. But the pickup in speed this year brings us to a point where KFC is operating fully autonomous mobile food trucks in Shanghai. The trucks pilot themselves to customers and serve up socially-distanced meals with no human involvement. The rush to keep up with changing consumer behavior has also sped up the adoption of cash-free and contactless retail, particularly in emerging markets where cash has traditionally been king. Financial services businesses tapping into technology like 5G networking and AI-powered fraud detection tools are responding to new expectations from customers in this field and, if they are able to predict that behavior accurately, are likely to see strong growth in coming years.
Investing in innovation

Remaining on the cutting edge of these trends means investing strategically in new ideas and innovation. So we also talked about Ericsson's Startup 5G program, which Pandrea heads up. Here the business looks to be at the head of the pack when it comes to creating the $31 trillion in revenue that it predicts will be generated by 5G platforms and services before 2030. Pandrea tells me that it is expected that a lot of this will come from services that telcos can bundle with their 5G offerings to help make their customers' lives better. One of the star players is XR Space, which is building a social VR platform using its own hardware that could effectively allow workers to take their office (and entertainment world) with them anywhere they go. Another is London-based Inception XR, which enables AR experiences to be created from books to help create more immersion and gamification in children's education. And a third that Pandrea recommends keeping an eye on for a glimpse of the future is PlaySight. It uses AI-powered 360-degree 8K cameras at sports or entertainment events, capable of capturing the action in greater detail than ever before. That data can then be delivered to an audience in any number of ways, including putting them inside VR experiences that let them view from any angle as well as pause and rewind what they are seeing.

Underlying technologies

Clearly, we can see the common threads of broader tech trends that run through these very relevant trends Ericsson is identifying today. AI technologies, as well as extended reality (XR), which includes VR, AR, and mixed reality (MR), are behind the tools that secure our networks, enable us to work efficiently from anywhere, receive remote healthcare, create immersive experiences and conduct autonomous commerce. High-speed networking is essential to every one of them too, and the quantum leap in upload and download speeds of 5G is necessary to make them all possible.
And it's certainly also true that much of the technological progress that is driving real change in business, commerce, society and entertainment has happened in response to the dark times we are living through. But as we start to cautiously look ahead to hopefully brighter days, these trends will go on to play a part in building a safer, smarter and more convenient future. To learn more about any of the trends we've covered here, you can watch our conversation in full here. And you can also take part in the Ericsson Unboxed virtual event that will take place on Wednesday, December 9th.
Wherein Dr. Know-it-all explains what an "S" adoption curve is, how it has functioned historically for technologies like automobiles, the internet, cell phones, and even smartphones, and why it matters a great deal for Tesla and other EV companies that are currently disrupting internal combustion engine (ICE) car manufacturers. Also, what happens when the EV adoption curve lines up with the full self-driving (FSD) adoption curve? Watch and find out!
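For readers who want to play with the idea, here is a minimal sketch of the logistic ("S") adoption curve in Python; the midpoint year and growth rate below are illustrative placeholders, not figures from the video:

```python
# The "S" curve: adoption starts slow, accelerates through a tipping
# point, then saturates as the market runs out of new adopters.

import math

def adoption(year, midpoint=2030.0, rate=0.5, saturation=1.0):
    """Logistic curve: fraction of the market adopted by a given year."""
    return saturation / (1.0 + math.exp(-rate * (year - midpoint)))

for y in (2020, 2025, 2030, 2035, 2040):
    # At the midpoint year the curve is exactly half-saturated.
    print(y, round(adoption(y), 3))
```

The "tipping point" language maps onto the steep middle of this curve: once adoption passes a few percent, each year's growth compounds until the market saturates.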
By the by, as folks have pointed out, and I probably should've noted in the video itself, Tony Seba has been talking about "the tipping point" for years. While I was inspired to work up this video from a Patreon patron, and I don't closely follow Seba, I should have acknowledged that a lot of this is derived from Tony's brilliant ideas over the years. One such video is here: //www.youtube.com/watch?v=y916mxoio0E
IT’s role is more critical than ever in a world that’s increasingly dependent on digital. Organizations are under increasing pressure to stay competitive and create connected experiences. According to our Connectivity benchmark report, IT projects are projected to grow by 40%; and 82% of businesses are now holding their IT teams accountable for delivering connected customer experiences.

To meet these rising demands, organizations are accelerating their digital transformation — which can be defined as the wholesale move to thinking about how to digitize every part of the business, as every part of the business now needs technology to operate. In order to drive scale and efficiency, IT must rethink its operating model to deliver self-serve capabilities and enable innovation across the enterprise.

In this report, we will highlight some of the top trends facing CIOs, IT leaders, and organizations in their digital transformation journey, sourcing data from both MuleSoft proprietary research and third-party findings.
The future of automation: declarative programming

Uri Sarid, CTO, MuleSoft

“The mounting complexity brought on by an explosion of co-dependent systems, dynamic data, and rising expectations demands a new approach to software. More is expected for software to just work automatically, and more of us expect automation of our digital life and work. In 2021, we’ll see more and more systems be intent-based, and see a new programming model take hold: a declarative one. In this model, we declare an intent — a desired goal or end state — and the software systems connected via APIs in an application network autonomously figure out how to simply make it so.”
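As a rough illustration of the intent-based model Sarid describes, here is a sketch of a declare-and-reconcile loop in Python; the resource names and reconcile logic are invented for illustration (the pattern resembles how declarative systems such as Kubernetes reconcile declared state):

```python
# Declarative style: we state the desired end state; the system
# computes the steps needed to get there.

desired = {"web": 3, "db": 1}   # intent: how many instances we want
actual = {"web": 1}             # current state of the world

def reconcile(desired, actual):
    """Compute the actions needed to move actual state to desired state."""
    actions = []
    for name, want in desired.items():
        have = actual.get(name, 0)
        if want > have:
            actions.append(("start", name, want - have))
        elif want < have:
            actions.append(("stop", name, have - want))
    return actions

print(reconcile(desired, actual))  # [('start', 'web', 2), ('start', 'db', 1)]
```

The caller never writes the imperative steps; they only declare the goal, and the reconciler figures out how to "make it so."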
2021 will be the year that data separates organizations from their competitors... and customers

Lindsey Irvine, CMO, MuleSoft

“The reality is that the majority of businesses today, across all industries, aren’t able to deliver truly connected experiences for their customers, partners, and employees — and that’s because delivering connected experiences requires a lot of data, which lives in an average of 900 different systems and applications across the enterprise. Integrating and unifying data across these systems is critical to create a single view of the customer and achieve true digital transformation.

“It’s also the number one reason digital transformation initiatives fail. As the amount of systems and applications continue to grow exponentially, teams realize that key to their success — and their organization’s success — is unlocking the data, wherever it exists, in a way that helps them deliver value faster.”
You know what the scientific method is until you try to define it: it’s a set of rules that scientists adopt to obtain a special kind of knowledge. The list is orderly, teachable and straightforward, at least in principle. But once you start spelling out the rules, you realize that they really don’t capture how scientists work, which is a lot messier. In fact, the rules exclude much of what you’d call science, and include even more of what you don’t. You even begin to wonder why anyone thought it necessary to specify a “scientific method” at all.

In his new book The Scientific Method: an Evolution of Thinking from Darwin to Dewey, the University of Michigan historian Henry Cowles explains why some people thought it necessary to define “scientific method” in the first place. Once upon a time, he writes, science meant something like knowledge itself – the facts we discover about the world rather than the sometimes unruly way we got them. Over time, however, science came to mean a particular stepwise way that we obtain those facts, independent of the humans who follow the method, and independent of the facts themselves.
Just as nature takes alternative forms of life and selects among them, Darwin argued, so scientists take hypotheses and choose the most robust. Nature has its own “method”, and humans acquire knowledge in an analogous way. Darwin’s scientific work on living creatures is indeed rigorous, as I think contemporary readers will agree, but through the lens of our notions of scientific method it was hopelessly anecdotal, psychological and disorganized. He was, after all, less focused on justifying his beliefs than on understanding nature.
Following Darwin, the American “pragmatists” – 19th-century philosophers such as Charles Peirce and William James – developed more refined accounts of the scientific method that meshed with their philosophical concerns. For Peirce and James, beliefs were not mental judgements or acts of faith, but habits that individuals develop through long experience. Beliefs are principles of action that are constantly tested against the world, reshaped and tested again, in an endless process. The scientific method is simply a careful characterization of this process.
This is Biology's AlexNet moment! DeepMind solves a 50-year-old problem in Protein Folding Prediction. AlphaFold 2 improves over DeepMind's 2018 AlphaFold system with a new architecture and massively outperforms all competition. In this video, we take a look at how AlphaFold 1 works and what we can gather about AlphaFold 2 from the little information that's out there.

OUTLINE:
0:00 - Intro & Overview
3:10 - Proteins & Protein Folding
14:20 - AlphaFold 1 Overview
18:20 - Optimizing a differentiable geometric model at inference
25:40 - Learning the Spatial Graph Distance Matrix
31:20 - Multiple Sequence Alignment of Evolutionarily Similar Sequences
39:40 - Distance Matrix Output Results
43:45 - Guessing AlphaFold 2 (it's Transformers)
53:30 - Conclusion & Comments

AlphaFold 2 Blog: https://deepmind.com/blog/article/alp...
AlphaFold 1 Blog: https://deepmind.com/blog/article/Alp...
AlphaFold 1 Paper: https://www.nature.com/articles/s4158...
MSA Reference: https://arxiv.org/abs/1211.1281
CASP14 Challenge: https://predictioncenter.org/casp14/i...
CASP14 Result Bar Chart: https://www.predictioncenter.org/casp...

Paper Title: High Accuracy Protein Structure Prediction Using Deep Learning

Abstract: Proteins are essential to life, supporting practically all its functions. They are large complex molecules, made up of chains of amino acids, and what a protein does largely depends on its unique 3D structure. Figuring out what shapes proteins fold into is known as the “protein folding problem”, and has stood as a grand challenge in biology for the past 50 years. In a major scientific advance, the latest version of our AI system AlphaFold has been recognised as a solution to this grand challenge by the organisers of the biennial Critical Assessment of protein Structure Prediction (CASP).
This breakthrough demonstrates the impact AI can have on scientific discovery and its potential to dramatically accelerate progress in some of the most fundamental fields that explain and shape our world.Authors: John Jumper, Richard Evans, Alexander Pritzel, Tim Green, Michael Figurnov, Kathryn Tunyasuvunakool, Olaf Ronneberger, Russ Bates, Augustin Žídek, Alex Bridgland, Clemens Meyer, Simon A A Kohl, Anna Potapenko, Andrew J Ballard, Andrew Cowie, Bernardino Romera-Paredes, Stanislav Nikolov, Rishub Jain, Jonas Adler, Trevor Back, Stig Petersen, David Reiman, Martin Steinegger, Michalina Pacholska, David Silver, Oriol Vinyals, Andrew W Senior, Koray Kavukcuoglu, Pushmeet Kohli, Demis Hassabis.
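To make the "optimizing a differentiable geometric model" step from the video outline concrete, here is a toy sketch in Python: given a target pairwise distance matrix, it recovers 3D coordinates by gradient descent. The four-point "protein" and its target distances are invented; the real AlphaFold 1 predicts distributions over distances for thousands of residue pairs:

```python
# Toy distance-geometry optimization: start from random coordinates and
# descend on the squared error between current and target pairwise distances.

import numpy as np

rng = np.random.default_rng(0)
n = 4
# Ground-truth points, used only to build a consistent target distance matrix
# (standing in for the distances a neural network would predict).
truth = rng.normal(size=(n, 3))
target = np.linalg.norm(truth[:, None] - truth[None, :], axis=-1)

coords = rng.normal(size=(n, 3))  # random initial "structure"

def error(c):
    d = np.linalg.norm(c[:, None] - c[None, :], axis=-1)
    return float(np.abs(d - target).max())

initial_error = error(coords)
lr = 0.01
for _ in range(5000):
    diff = coords[:, None] - coords[None, :]       # (n, n, 3) displacement
    dist = np.linalg.norm(diff, axis=-1)           # (n, n) current distances
    np.fill_diagonal(dist, 1.0)                    # avoid divide-by-zero
    err = dist - target
    np.fill_diagonal(err, 0.0)
    # Gradient of 0.5 * sum((dist - target)^2) with respect to coords.
    grad = 2 * ((err / dist)[:, :, None] * diff).sum(axis=1)
    coords -= lr * grad

print(initial_error, error(coords))  # the error shrinks as we descend
```

Because distances are a differentiable function of coordinates, the same trick scales up: AlphaFold 1 builds a potential from its predicted distance distributions and relaxes a structure against it with gradient descent.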
Constructing agents with planning capabilities has long been one of the main challenges in the pursuit of artificial intelligence. Tree-based planning methods have enjoyed huge success in challenging domains, such as chess [1] and Go [2], where a perfect simulator is available. However, in real-world problems, the dynamics governing the environment are often complex and unknown. Here we present the MuZero algorithm, which, by combining a tree-based search with a learned model, achieves superhuman performance in a range of challenging and visually complex domains, without any knowledge of their underlying dynamics. The MuZero algorithm learns an iterable model that produces predictions relevant to planning: the action-selection policy, the value function and the reward. When evaluated on 57 different Atari games [3] — the canonical video game environment for testing artificial intelligence techniques, in which model-based planning approaches have historically struggled [4] — the MuZero algorithm achieved state-of-the-art performance. When evaluated on Go, chess and shogi — canonical environments for high-performance planning — the MuZero algorithm matched, without any knowledge of the game dynamics, the superhuman performance of the AlphaZero algorithm [5] that was supplied with the rules of the game.
MuZero is trained only on data generated by MuZero itself; no external data were used to produce the results presented in the article. Data for all figures and tables presented are available in JSON format in the Supplementary Information.
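As a rough illustration of the structure described in the abstract, here is a toy sketch of MuZero's three learned components (representation, dynamics, and prediction) driving a one-step planning lookahead; the hand-written stand-in functions and action names are invented, and the real system uses Monte Carlo tree search over trained networks:

```python
# MuZero plans entirely inside a learned model: it never consults the
# real environment's rules during search.

def representation(observation):
    """h: encode the raw observation into a hidden state."""
    return ("state", observation)

def dynamics(hidden, action):
    """g: predict the next hidden state and immediate reward for an action."""
    reward = 1.0 if action == "good" else 0.0
    return ("state", (hidden, action)), reward

def prediction(hidden):
    """f: predict the action-selection policy and value of a hidden state."""
    return {"good": 0.5, "bad": 0.5}, 0.0

def plan(observation):
    """One-step lookahead in the learned model (real MuZero runs MCTS)."""
    root = representation(observation)
    best_action, best_return = None, float("-inf")
    for action in prediction(root)[0]:
        hidden, reward = dynamics(root, action)   # imagined transition
        _, value = prediction(hidden)             # imagined long-term value
        if reward + value > best_return:
            best_action, best_return = action, reward + value
    return best_action

print(plan("frame0"))  # prints "good"
```

The key design choice the abstract highlights is that these three functions replace a perfect simulator: the search tree is expanded by calling the learned dynamics function rather than the game's actual rules.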
Vaccines are just the start. In theory this is capable of helping cure almost anything: it turns medicine into a software and simulation problem.