The Naked Scientists
How close are we from building a virtual universe?

  • 372 Replies
  • 44495 Views

hamdani yusuf (OP)
How close are we from building a virtual universe?
« on: 21/09/2019 09:50:36 »
This thread is another spinoff from my earlier thread called universal utopia. This time I try to attack the problem from another angle, namely an information-theory point of view.
I have started another thread related to this subject asking about the quantification of accuracy and precision. We need to be able to compare the available methods for describing some aspect of objective reality and choose the best option based on cost-benefit considerations. I thought this was already common knowledge, but the course of the discussion showed it wasn't. I guess I'll have to build a new theory for that. It's unfortunate that the thread has been removed, so new forum members can't explore how the discussion developed.
In my professional job I deal with process control and automation, and with the engineering and maintenance of electrical and instrumentation systems. It's important for us to explore leading technologies and use them to our advantage to survive the fierce industrial competition of Industry 4.0. One technology closely related to this thread is the digital twin.
https://en.m.wikipedia.org/wiki/Digital_twin

Just like my other spinoff discussing universal morality, which can be reached by expanding the groups that develop their own subjective morality to the maximum extent permitted by logic, here I also try to expand the scope of the virtualization of real-world objects, from digital twins in the industrial sector to other fields as well. Hopefully it will lead us to global governance, because all conscious beings known today share the same planet. In the future the scope will need to expand even further, because the exploration of other planets and solar systems is already under way.

Unexpected results come from false assumptions.
jeffreyH (Global Moderator)
Re: How close are we from building a virtual universe?
« Reply #1 on: 21/09/2019 11:06:32 »
How detailed should the virtual universe be? Does it include only the observable universe? Depending on the detail and scale, it could require more information to describe than the universe actually contains.

A better model would study a well defined region of the universe such as a galaxy cluster. However, this would still depend upon the level of detail.
Even the most obstinately ignorant cannot avoid learning when in an environment that educates.
evan_au (Global Moderator)
Re: How close are we from building a virtual universe?
« Reply #2 on: 21/09/2019 22:37:58 »
There is a definite tradeoff between level of detail, computer power and memory storage.

If your goal is to study the general shape of the universe, it is important to include dark matter and normal matter (which clumps into galaxies), but modelling individual stars is not needed.

If you are studying the shape of the galaxy, you don't need to model the lifecycle of the individual stars.

If you are studying the orbits of the planets around the Sun, you don't need to model whether or not Earth hosts life.
hamdani yusuf (OP)
Re: How close are we from building a virtual universe?
« Reply #3 on: 22/09/2019 04:12:47 »
Building a virtual universe is just an instrumental goal, meant to increase the chance of achieving the terminal goal by improving the effectiveness and efficiency of our actions. I discussed these goals in another thread about universal morality; here I want to focus on technical issues.
Efforts to virtualize objective reality have already started, but they are currently mostly partial, either by location or by function. Some examples are Google Maps, ERP software, online banking, e-commerce, Wikipedia, cryptocurrency, CAD software, SCADA, and social media.
Their lack of integration may lead to data duplication when different systems represent the same object from different points of view. When they are not updated simultaneously, they produce inconsistencies, which may lead to incorrect decisions.
« Last Edit: 23/09/2019 04:24:37 by hamdani yusuf »
hamdani yusuf (OP)
Re: How close are we from building a virtual universe?
« Reply #4 on: 22/09/2019 04:23:45 »
The level of detail can vary, depending on the significance of the object. In Google Earth, big cities might be mapped at less than one metre per pixel, while deserts and oceans get much coarser detail.
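The varying-detail idea can be sketched as a simple rule. The tiers and thresholds below are invented for illustration; this is not how Google Earth actually chooses resolution.

```python
# Hypothetical significance-based level of detail: busier regions get
# finer resolution, sparse regions get coarser resolution.

def metres_per_pixel(population_density: float) -> float:
    """Pick a map resolution from a (made-up) significance measure."""
    if population_density > 1000:   # dense city
        return 0.5
    if population_density > 10:     # town or suburb
        return 5.0
    return 50.0                     # desert, ocean, wilderness

# A dense city is stored ~100x finer than open ocean.
assert metres_per_pixel(5000) < metres_per_pixel(0)
```

The same significance-driven budgeting applies to any resource-limited model, not just maps.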
jeffreyH (Global Moderator)
Re: How close are we from building a virtual universe?
« Reply #5 on: 22/09/2019 13:56:17 »
Quote from: hamdani yusuf on 22/09/2019 04:12:47
Building a virtual universe is just an instrumental goal, which is meant to increase the chance to achieve the terminal goal by improving the effectiveness and efficiency of our actions. I discussed these goals in another thread about universal morality. Here I want to discuss more about technical issues.
Efforts for virtualization of objective reality has already started, but currently they are mostly partial, either by location or function. Some examples are google map, ERP software, online banking, e-commerce, wikipedia, crypto currency, CAD software, SCADA.
Their lack of integration may lead to data duplication when different systems are representing the same object from different point of view. When they are not updated simultaneously, they will produce inconsistency, which may lead to incorrect decision makings.

You are talking about disparate systems. They are also human centric and not universe centric.
hamdani yusuf (OP)
Re: How close are we from building a virtual universe?
« Reply #6 on: 23/09/2019 04:34:54 »
Quote from: jeffreyH on 22/09/2019 13:56:17

You are talking about disparate systems. They are also human centric and not universe centric.
They are disparate now, but there are already efforts to integrate them. Some ERP systems have been connected to plant information management systems, which in turn can be connected to SCADA, DCS, PLCs, and even smart field devices such as transmitters, control valve positioners, and variable-speed drives.
What we need is a common platform that stores this information in the same or a compatible format, so that any update in one subsystem is automatically propagated to related subsystems to guarantee data integrity. The common platform must also take care of user accountability and data accessibility.
« Last Edit: 24/09/2019 11:17:34 by hamdani yusuf »
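A minimal sketch of that propagation idea, assuming a publish/subscribe style registry (all class, subsystem, and tag names here are hypothetical): one write to a shared object fans out to every subsystem that holds a view of it.

```python
# Hypothetical shared registry: subsystems subscribe to an object, and a
# single update notifies all of them, so duplicated views never diverge.

class Registry:
    def __init__(self):
        self.views = {}                      # object id -> list of callbacks

    def subscribe(self, obj_id, callback):
        self.views.setdefault(obj_id, []).append(callback)

    def update(self, obj_id, field, value):
        # One write fans out to every subscribed subsystem, so an ERP
        # record and a SCADA tag for the same valve cannot disagree.
        for cb in self.views.get(obj_id, []):
            cb(field, value)

erp, scada = {}, {}                          # two subsystem views (toy dicts)
reg = Registry()
reg.subscribe("valve-101", lambda f, v: erp.__setitem__(f, v))
reg.subscribe("valve-101", lambda f, v: scada.__setitem__(f, v))
reg.update("valve-101", "setpoint", 42.0)
assert erp == scada == {"setpoint": 42.0}
```

Real integration layers would add persistence, access control, and conflict handling, but the fan-out principle is the same.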
hamdani yusuf (OP)
Re: How close are we from building a virtual universe?
« Reply #7 on: 24/09/2019 11:20:57 »
Building a high-precision virtualization of objective reality takes a lot of resources in the form of data storage and communication bandwidth. Hence the system needs to maximize information density.
hamdani yusuf (OP)
Re: How close are we from building a virtual universe?
« Reply #8 on: 10/10/2019 14:20:30 »
My basic idea for building a virtual universe is to represent physical objects as nodes, organized in a hierarchical structure. It is like the Unix principle that everything is a file; here, everything is a node.
To address the is-ought problem, another hierarchical structure is created to represent desired/designed conditions.
A relationship table shows the assignment of physical objects to designed objects, and stores additional relationship types between them where necessary. Further relationship tables capture relationships among nodes outside the main hierarchies.
Yet another hierarchical structure represents activities/events, which are basically any changes to nodes in the hierarchies of physical and desired objects. Activity nodes carry timestamps for start and finish.
I have built a prototype of this system based on a DCS configuration database, which was then expanded to accommodate things beyond I/O assignments, the physical network, and control strategies.
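A toy version of the node model above, with field and entity names invented for illustration: physical and designed objects live in their own hierarchies, and a relationship table links them.

```python
# Hypothetical sketch of the node model: everything is a node, nodes form
# hierarchies, and relationship tables connect the hierarchies.
from dataclasses import dataclass, field

@dataclass
class Node:
    name: str
    parent: "Node | None" = None
    children: list = field(default_factory=list)

    def add(self, child: "Node") -> "Node":
        """Attach a child node and return it for chaining."""
        child.parent = self
        self.children.append(child)
        return child

# Physical hierarchy (what exists) and design hierarchy (what should exist).
plant = Node("plant")
pump = plant.add(Node("pump-P101"))
design = Node("design")
spec = design.add(Node("pump-spec-A"))

# Relationship table: assignment of physical objects to designed objects,
# with a relationship type per row.
relations = [(pump, spec, "implements")]
assert pump.parent is plant and relations[0][2] == "implements"
```

An activity hierarchy would add nodes with start/finish timestamps that reference entries in these two trees.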
hamdani yusuf (OP)
Re: How close are we from building a virtual universe?
« Reply #9 on: 23/01/2020 07:10:31 »
The universe as we know it is a dynamic system: it changes over time. So, for a virtual universe to be useful, it also needs to be dynamic. Static systems such as paper maps or ancient cave paintings have only limited use for narrow purposes.
Bored chemist
Re: How close are we from building a virtual universe?
« Reply #10 on: 23/01/2020 07:16:04 »
There is a virtual universe in your head.
Please disregard all previous signatures.
hamdani yusuf (OP)
Re: How close are we from building a virtual universe?
« Reply #11 on: 23/01/2020 07:21:49 »
Quote from: hamdani yusuf on 24/09/2019 11:20:57
Building a virtualization of objective reality in high precision takes a lot of resources in the form of data storage and communication bandwith. Hence the system needs to maximize information density.
Here is an interesting excerpt from Ray Kurzweil's book "The Singularity Is Near" regarding order and complexity, which are closely related to information density.
Quote
Not surprisingly, the concept of complexity is complex. One concept of complexity is the minimum amount of information required to represent a process. Let's say you have a design for a system (for example, a computer program or a computer-assisted design file for a computer), which can be described by a data file containing one million bits. We could say your design has a complexity of one million bits. But suppose we notice that the one million bits actually consist of a pattern of one thousand bits that is repeated one thousand times. We could note the repetitions, remove the repeated patterns, and express the entire design in just over one thousand bits, thereby reducing the size of the file by a factor of about one thousand.

The most popular data-compression techniques use similar methods of finding redundancy within information. But after you've compressed a data file in this way, can you be absolutely certain that there are no other rules or methods that might be discovered that would enable you to express the file in even more compact terms? For example, suppose my file was simply "pi" (3.1415...) expressed to one million bits of precision. Most data-compression programs would fail to recognize this sequence and would not compress the million bits at all, since the bits in a binary expression of pi are effectively random and thus have no repeated pattern according to all tests of randomness.

But if we can determine that the file (or a portion of the file) in fact represents pi, we can easily express it (or that portion of it) very compactly as "pi to one million bits of accuracy." Since we can never be sure that we have not overlooked some even more compact representation of an information sequence, any amount of compression sets only an upper bound for the complexity of the information. Murray Gell-Mann provides one definition of complexity along these lines. He defines the "algorithmic information content" (AIC) of a set of information as "the length of the shortest program that will cause a standard universal computer to print out the string of bits and then halt."

However, Gell-Mann's concept is not fully adequate. If we have a file with random information, it cannot be compressed. That observation is, in fact, a key criterion for determining if a sequence of numbers is truly random. However, if any random sequence will do for a particular design, then this information can be characterized by a simple instruction, such as "put random sequence of numbers here." So the random sequence, whether it's ten bits or one billion bits, does not represent a significant amount of complexity, because it is characterized by a simple instruction. This is the difference between a random sequence and an unpredictable sequence of information that has purpose.

To gain some further insight into the nature of complexity, consider the complexity of a rock. If we were to characterize all of the properties (precise location, angular momentum, spin, velocity, and so on) of every atom in the rock, we would have a vast amount of information. A one-kilogram (2.2-pound) rock has 10^25 atoms which, as I will discuss in the next chapter, can hold up to 10^27 bits of information. That's one hundred million billion times more information than the genetic code of a human (even without compressing the genetic code). But for most common purposes, the bulk of this information is largely random and of little consequence. So we can characterize the rock for most purposes with far less information just by specifying its shape and the type of material of which it is made. Thus, it is reasonable to consider the complexity of an ordinary rock to be far less than that of a human even though the rock theoretically contains vast amounts of information.

One concept of complexity is the minimum amount of meaningful, non-random, but unpredictable information needed to characterize a system or process.

In Gell-Mann's concept, the AIC of a million-bit random string would be about a million bits long. So I am adding to Gell-Mann's AIC concept the idea of replacing each random string with a simple instruction to "put random bits here."

However, even this is not sufficient. Another issue is raised by strings of arbitrary data, such as names and phone numbers in a phone book, or periodic measurements of radiation levels or temperature. Such data is not random, and data-compression methods will only succeed in reducing it to a small degree. Yet it does not represent complexity as that term is generally understood. It is just data. So we need another simple instruction to "put arbitrary data sequence here."

To summarize my proposed measure of the complexity of a set of information, we first consider its AIC as Gell-Mann has defined it. We then replace each random string with a simple instruction to insert a random string. We then do the same for arbitrary data strings. Now we have a measure of complexity that reasonably matches our intuition.
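The excerpt's redundancy argument can be demonstrated directly with a standard compressor: a highly repetitive file shrinks enormously, while random bytes barely compress at all.

```python
# Demonstrating the redundancy idea from the excerpt with zlib:
# a repeated 1,000-bit pattern compresses drastically; random data does not.
import os
import zlib

repetitive = b"0001011011" * 1000          # a short pattern repeated 1,000 times
random_ish = os.urandom(10000)             # incompressible with high probability

small = len(zlib.compress(repetitive))
large = len(zlib.compress(random_ish))

assert small < 200       # redundancy found and removed
assert large > 9000      # no pattern, so almost no savings
```

As the excerpt notes, the compressed size is only an upper bound on complexity: zlib would not spot that a file encodes the digits of pi.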
hamdani yusuf (OP)
Re: How close are we from building a virtual universe?
« Reply #12 on: 23/01/2020 07:25:54 »
Quote from: Bored chemist on 23/01/2020 07:16:04
There is a virtual universe in your head.
Indeed, but it covers only a small portion of even the currently observable universe, and a lot of information that I once knew has already been lost. To be useful for predicting events in the far future, we need a much larger and more complex system.
hamdani yusuf (OP)
Re: How close are we from building a virtual universe?
« Reply #13 on: 24/01/2020 08:22:35 »
Regarding the original question, it turns out that Ray Kurzweil has already predicted the answer: around the middle of this century.

Quote
Kurzweil claims that technological progress follows a pattern of exponential growth, following what he calls the "law of accelerating returns". Whenever technology approaches a barrier, Kurzweil writes, new technologies will surmount it. He predicts paradigm shifts will become increasingly common, leading to "technological change so rapid and profound it represents a rupture in the fabric of human history".[39] Kurzweil believes that the singularity will occur by approximately 2045.[40] His predictions differ from Vinge's in that he predicts a gradual ascent to the singularity, rather than Vinge's rapidly self-improving superhuman intelligence.
https://en.wikipedia.org/wiki/Technological_singularity#Accelerating_change
hamdani yusuf (OP)
Re: How close are we from building a virtual universe?
« Reply #14 on: 03/02/2020 11:02:21 »
Objective reality contains a lot of objects with complex relationships among them. Hence, to build a virtual universe, we must use a method capable of storing data that represents this complex system. The obvious choice is a graph: a mathematical structure used to model pairwise relations between objects. A graph in this context is made up of vertices (also called nodes or points) connected by edges (also called links or lines).

https://en.wikipedia.org/wiki/Graph_theory
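A minimal sketch of that graph idea, with invented example entities: vertices are objects, and each edge carries a relation label.

```python
# A labelled, directed graph stored as an adjacency list: the simplest data
# structure matching "objects as vertices, relationships as edges".
from collections import defaultdict

edges = defaultdict(list)   # vertex -> [(neighbour, relation), ...]

def link(a, b, relation):
    """Add a labelled, directed edge between two objects."""
    edges[a].append((b, relation))

link("Moon", "Earth", "orbits")
link("Earth", "Sun", "orbits")
link("Sun", "Milky Way", "member of")

# The same structure holds any pairwise relation, physical or abstract.
assert ("Earth", "orbits") in edges["Moon"]
```

Production systems would use a graph database for this, but the underlying model is the same adjacency structure.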
hamdani yusuf (OP)
Re: How close are we from building a virtual universe?
« Reply #15 on: 28/02/2020 10:02:07 »
https://www.technologyreview.com/s/615189/what-ai-still-cant-do/
Quote
Artificial intelligence won’t be very smart if computers don’t grasp cause and effect. That’s something even humans have trouble with.

In less than a decade, computers have become extremely good at diagnosing diseases, translating languages, and transcribing speech. They can outplay humans at complicated strategy games, create photorealistic images, and suggest useful replies to your emails.
Yet despite these impressive achievements, artificial intelligence has glaring weaknesses.

Machine-learning systems can be duped or confounded by situations they haven’t seen before. A self-driving car gets flummoxed by a scenario that a human driver could handle easily. An AI system laboriously trained to carry out one task (identifying cats, say) has to be taught all over again to do something else (identifying dogs). In the process, it’s liable to lose some of the expertise it had in the original task. Computer scientists call this problem “catastrophic forgetting.”

These shortcomings have something in common: they exist because AI systems don’t understand causation. They see that some events are associated with other events, but they don’t ascertain which things directly make other things happen. It’s as if you knew that the presence of clouds made rain likelier, but you didn’t know clouds caused rain.
Quote
AI can’t be truly intelligent until it has a rich understanding of cause and effect, which would enable the introspection that is at the core of cognition.
Judea Pearl
A virtual universe can map commonly known cause-and-effect relationships to be used as a library by AI agents, which would save a lot of time otherwise spent training each new AI agent from scratch.
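A sketch of such a shared cause-and-effect library, using everyday toy entries rather than any real knowledge base: a directed graph of "X causes Y" facts that a new agent could query instead of relearning them.

```python
# Hypothetical shared causal library: a directed graph of cause -> effects,
# with a transitive query so agents can chain consequences.

causes = {
    "clouds": ["rain"],
    "rain": ["wet ground"],
}

def effects_of(event, graph):
    """Transitively collect everything the event can lead to."""
    seen, stack = set(), [event]
    while stack:
        for nxt in graph.get(stack.pop(), []):
            if nxt not in seen:
                seen.add(nxt)
                stack.append(nxt)
    return seen

assert effects_of("clouds", causes) == {"rain", "wet ground"}
```

This is exactly the association-versus-causation distinction from the article: the graph records direction, not mere co-occurrence.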
hamdani yusuf (OP)
Re: How close are we from building a virtual universe?
« Reply #16 on: 09/03/2020 07:02:22 »
To achieve generality, an AI is required to adapt to a wide range of situations. It would be better to have a modular structure for frequently used basic functions, similar to the organization of naturally occurring brains. It must have some flexibility over its own hyperparameters, which might require changes when executing different tasks.
To maintain its own integrity and fight off data corruption or cyber attacks, the AI needs to spend some of its data storage and processing capacity representing its own structure. This will create a sort of self-awareness, which is a step towards artificial consciousness.
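One narrow, illustrative reading of "representing its own structure" is a system that keeps a checksum of its own configuration so corruption is detectable. The config fields here are invented; this is a sketch, not a security mechanism.

```python
# A system storing a fingerprint of its own configuration: any corruption
# of the stored structure changes the digest and is detected.
import hashlib
import json

config = {"layers": 3, "learning_rate": 0.01}
fingerprint = hashlib.sha256(
    json.dumps(config, sort_keys=True).encode()
).hexdigest()

def is_intact(current_config, expected):
    """Recompute the digest and compare against the stored fingerprint."""
    digest = hashlib.sha256(
        json.dumps(current_config, sort_keys=True).encode()
    ).hexdigest()
    return digest == expected

assert is_intact(config, fingerprint)
config["layers"] = 99                      # simulated corruption
assert not is_intact(config, fingerprint)
```

A full self-model would go far beyond a checksum, but even this minimal form requires the system to spend storage describing itself.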
hamdani yusuf (OP)
Re: How close are we from building a virtual universe?
« Reply #17 on: 14/04/2020 09:54:24 »
The article below has reminded me once again of the importance of having a universal modelling system/platform.
Quote
COBOL, a 60-year-old computer language, is in the COVID-19 spotlight
As state governments seek to fix overwhelmed unemployment benefit systems, they need programmers skilled in a language that was passé by the early 1980s.

Some states have found themselves in need of people who know a 60-year-old programming language called COBOL to retrofit the antiquated government systems now struggling to process the deluge of unemployment claims brought by the coronavirus crisis.

The states of Kansas, New Jersey, and Connecticut all experienced technical meltdowns after a stunning 6.6 million Americans filed for unemployment benefits last week.

They might not have an easy time finding the programmers they need. There just aren’t that many people around these days who know COBOL, or Common Business-Oriented Language. Most universities stopped teaching the language back in the 1980s. COBOL is considered a relic by younger coders.

“There’s really no good reason to learn COBOL today, and there was really no good reason to learn it 20 years ago,” says UCLA computer science professor Peter Reiher. “Most students today wouldn’t have ever even heard of COBOL.”

Meanwhile, because many banks, large companies, and government agencies still use the language in their legacy systems, there’s plenty of demand for COBOL programmers. A search for “COBOL Developer” returned 568 jobs on Indeed.com. COBOL developers make anywhere from $40 to more than $100 per hour.

Kansas governor Laura Kelley said the Kansas Department of Labor was in the process of migrating systems from COBOL to a newer language, but that the effort was postponed by the virus. New Jersey governor Phil Murphy wondered why such an old language was being used on vital state government systems, and classed it with the many weaknesses in government systems the virus has revealed.

The truth is, organizations often hesitate to change those old systems because they still work, and migrating to new systems is expensive. Massive upgrades also involve writing new code, which may contain bugs, Reiher says. In the worst-case scenario, bugs might cause the loss of customer financial data being moved from the old system to the new.
IT STILL WORKS (MOSTLY)
COBOL, though ancient, is still considered stable and reliable—at least under normal conditions.

The current glitches with state unemployment systems are "probably not a specific flaw in the COBOL language or in the underlying implementation," Reiher says. "The problem is more likely that some states are asking their computer systems to work with data on a far higher scale," he said, "and making the systems do things they've never been asked to do."

COBOL was developed in the early 1960s by computer scientists from universities, mainframe manufacturers, the defense and banking industries, and government. Based on ideas developed by programming pioneer Grace Hopper, it was driven by the need for a language that could run on a variety of different kinds of mainframes.

“It was developed to do specific kinds of things like inventory and payroll and accounts receivable,” Reiher told me. “It was widely used in 1960s by a lot of banks and government agencies when they first started automating their systems.”

Here in the 21st century, COBOL is still quietly doing those kinds of things. Millions of lines of COBOL code still run on mainframes used in banks and a number of government agencies, including the Department of Veterans Affairs, Department of Justice, and Social Security Administration. A 2017 Reuters report said 43% of banking systems still use COBOL.

But the move to newer languages such as Java, C, and Python is making its way through industries of all sorts, and will eventually be used in new systems used by banks and government. One key reason for the migration is that mobile platforms use newer languages, and they rely on tight integration with underlying systems to work the way users expect.

The coronavirus will be a catalyst for a lot of changes in the coming years, some good, some bad. The migration away from the programming languages of another era may be one of the good ones.

https://www.fastcompany.com/90488862/what-is-cobol

My previous job as a system integrator gave me first-hand experience with this issue. Most of the projects I handled were migrations from an old or obsolete system to a newer one (mostly DCS). The most obvious advantage of such projects is that we had a system that was still working. The challenge was translating the source code of the old system into the new one. When the two couldn't be put in one-to-one correspondence, we needed to use a process control narrative as an intermediary. Often we couldn't get access to the source code at all, due to the age of the system, missing documentation (hard copies of ladder diagrams, function block diagrams, sequential function charts, proprietary scripts), or corrupted floppy disks. So we had to rely on additional information from the process operators and supervisors about how the system was supposed to work.
In greenfield systems, on the other hand, there is no existing source code, so we had to work from the control narratives provided by the process engineers. There is no guarantee that the system will work as intended; often we had to make tweaks, adjustments, and even major modifications during commissioning.
If only we had a universal modelling system/platform, we could save a lot of time and effort on such projects. System migrations could then be done automatically.
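The "control narrative as intermediary" workflow can be sketched like this, with a hypothetical rule format and tag names: each rule is captured in a neutral form, then rendered for the target system instead of translating old code directly.

```python
# Hypothetical intermediate representation for control logic: neutral
# (tag, comparison, limit, action) rules, rendered per target system.

# Neutral intermediate form, recovered from old code or operator interviews.
narrative = [
    ("LI-101", ">", 95.0, "close FV-101"),
    ("TI-205", ">", 150.0, "trip pump P-201"),
]

def render_pseudocode(rules):
    """Render the neutral rules for one (imaginary) target system."""
    return [f"IF {t} {op} {lim} THEN {act}" for t, op, lim, act in rules]

out = render_pseudocode(narrative)
assert out[0] == "IF LI-101 > 95.0 THEN close FV-101"
```

With one renderer per target platform, the same narrative could be re-deployed without a manual rewrite, which is the automation the post argues for.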
« Last Edit: 14/04/2020 11:07:09 by hamdani yusuf »
hamdani yusuf (OP)
Re: How close are we from building a virtual universe?
« Reply #18 on: 28/04/2020 05:29:32 »
The progress in building better AI, and eventually AGI, will get us closer to the realization of Laplace's demon, which has already been predicted in the form of the technological singularity.
Quote
The better we can predict, the better we can prevent and pre-empt. As you can see, with neural networks, we’re moving towards a world of fewer surprises. Not zero surprises, just marginally fewer. We’re also moving toward a world of smarter agents that combine neural networks with other algorithms like reinforcement learning to attain goals.
https://pathmind.com/wiki/neural-network
Quote
In some circles, neural networks are thought of as “brute force” AI, because they start with a blank slate and hammer their way through to an accurate model. They are effective, but to some eyes inefficient in their approach to modeling, which can’t make assumptions about functional dependencies between output and input.

That said, gradient descent is not recombining every weight with every other to find the best match – its method of pathfinding shrinks the relevant weight space, and therefore the number of updates and required computation, by many orders of magnitude. Moreover, algorithms such as Hinton’s capsule networks require far fewer instances of data to converge on an accurate model; that is, present research has the potential to resolve the brute force nature of deep learning.
hamdani yusuf (OP)
Re: How close are we from building a virtual universe?
« Reply #19 on: 28/07/2020 05:52:53 »
This Is What Tesla's Autopilot Sees On The Road.

Essentially, Autopilot builds a virtual environment in its computer from visual-camera and radar input. With more Autopilot cars on the road, much of the data being processed becomes redundant. Sharing that data could be the next step in increasing the efficiency of the whole system. It would require an agreed protocol, data structure, and algorithm to interpret the data properly. This brings us one step closer to a virtual universe.
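The agreed-protocol idea can be sketched as a shared detection schema (the schema and field names are invented for illustration): identical observations from different cars merge instead of being processed twice.

```python
# Hypothetical shared perception schema: if all cars report detections in
# one agreed format, overlapping observations can be deduplicated.
from dataclasses import dataclass

@dataclass(frozen=True)
class Detection:
    object_type: str      # "car", "pedestrian", ...
    lat: float
    lon: float

def merge(reports):
    """Deduplicate identical detections reported by different cars."""
    return set(reports)

car_a = [Detection("pedestrian", 52.20, 0.12)]
car_b = [Detection("pedestrian", 52.20, 0.12)]   # same person, second car
assert len(merge(car_a + car_b)) == 1
```

A real protocol would need tolerances, timestamps, and trust handling, but the saving comes from exactly this kind of shared, mergeable representation.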
Tags: virtual universe / amazing technologies / singularity / future science / connection
 

©The Naked Scientists® 2000–2017 | The Naked Scientists® and Naked Science® are registered trademarks created by Dr Chris Smith. Information presented on this website is the opinion of the individual contributors and does not reflect the general views of the administrators, editors, moderators, sponsors, Cambridge University or the public at large.