The Future

Here is the world’s first robotics museum, built by robots, of course

The first exhibition of Seoul’s Robot Museum will be the robots building the museum itself.

Seoul wants to have the world’s very first museum dedicated to robotic science. And the city authorities have decided on the best possible way to build it: use robots, of course.

The museum, designed by Turkish architectural firm Melike Altınışık, is intended to be one of the most recognizable buildings in the Changbai New Economic Center, a newly redeveloped area in the northern part of the city.

Its organic form, a semi-sphere that seems to flow in waves to reveal a glass and steel base, will be built by robots. According to the firm’s design principal Melike Altınışık, the building has been conceived as a temple to robotic innovation, so the best way they could materialize that ethos was by using robotic arms to assemble the new space.

First, a team of robots will mold the curved metal plates that form the museum’s sphere, working from a 3D building information modeling system (essentially a CAD system that works with solid objects in real 3D space rather than representing them with 2D plans). Robots will then assemble the plates, welding and polishing the metal to achieve its final surface appearance.

Then another team of robots, the architectural firm says, will 3D-print concrete to build the public area surrounding the museum.

This process will start in early 2020, with the museum opening its doors about two years after that.

My only question: Are they using robots to build the robot builders? And if so, who will build the robots that build those robots, and would this infinite loop cause a tear in the space-time continuum that sucks the entire museum into a black hole?

The Future

These 3 Computing Technologies Will Beat Moore’s Law

There’s a big lie about disruption going around. And folks aren’t spreading it intentionally.

Many smart investors I talk to genuinely believe it to be the truth.

If you accept this widespread lie, you’ll likely make poor decisions when investing in disruptive companies.

Here, I’ll explain the real truth and why it matters to disruption investors.

Your Smartphone Is More Powerful than an Early ‘90s Supercomputer

Your smartphone can do the job of a whole collection of gadgets.

It’s a phone, camera, camcorder, Walkman, watch, wallet, radio, global map, TV, VCR, and computer all in one.

And keep in mind, all a supercomputer does is crunch numbers; your phone does everything above and still out-crunches an early-’90s supercomputer. We have “Moore’s law” to thank for this.

Named after Intel co-founder Gordon Moore, it observes that computing power doubles roughly every two years.

This has led to exponential growth in computing power.

As you may know, exponential growth “snowballs” over time. It builds momentum and eventually leads to near-vertical gains.

For the past few decades, computing power has more or less followed this path.

This Is the Driving Force Behind Moore’s Law

Moore’s Law says the number of transistors that can fit on a computer chip doubles about every two years.

Transistors allow computers to compute. The more transistors you cram onto a chip, the more computing power it has.

Again, for the past 50 years, this has more or less held true. Back in 1965, only 64 transistors fit on the world’s most complex computer chip.

More than 10 billion transistors can fit on today’s chips.
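Those two figures are consistent with the doubling cadence. Here’s a quick back-of-the-envelope check in Python (the 1965 and present-day transistor counts come from the article; the two-year doubling is Moore’s observation):

```python
# Project transistor counts from the 1965 baseline, doubling every two years.
def projected_transistors(year, base_year=1965, base_count=64):
    doublings = (year - base_year) / 2
    return base_count * 2 ** doublings

for year in (1965, 1985, 2005, 2019):
    print(year, f"{projected_transistors(year):,.0f}")
# 2019 -> ~8.6 billion, the same order of magnitude as the
# ~10 billion transistors on today's largest chips.
```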

Moore’s law is responsible for many of the giant stock market gains in the past few decades.

Leaps in computing power enabled big disruptors like Apple, Microsoft, and Amazon to achieve huge gains like 50,800%, 159,900%, and 111,560%.

And along the way, the companies that make the computer chips have gotten rich, too.

Taiwan Semiconductor, Micron Technology, and Intel achieved gains of 1,014%, 3,256%, and 35,050%.

Conventional wisdom is that Moore’s law will continue to snowball. As progress gets faster and faster, you can understand why many folks think we’re headed for a tech utopia.

It’s a great story. But it’s not quite true.

Moore’s Law Will Break Down

Moore’s law isn’t really a law. Gravity is a law. Moore’s law is an observation and a forecast.

As I mentioned, since 1965, it has held true. But here’s the key…

Within the next few years, Moore’s law will break down.

You see, although today’s transistors are microscopic, they still take up physical space. There’s a limit to how small you can make anything that occupies physical space.

We are now approaching that limit with transistors. So the progress predicted by Moore’s law must slow.

In fact, Moore’s law is already slowing down. Many technologists predict it will totally break down sometime between 2022 and 2025.

Does that mean progress will stop?

Not a chance.

New technologies will pick up where Moore’s law leaves off. There are three exciting computing technologies in development you should know about.

3D Computing Hits the Market Later This Year

What does a city do when it runs short on land? It builds skyscrapers.

By building “up,” you can create real estate with the footprint of a one-story building but room for 100X more people.

Something similar is just getting underway in computing.

You see, the “guts” of computers have always been two dimensional. Flat computer chips sit on a flat motherboard. Nothing moves in 3D. There’s no “up” or “down” inside a computer chip.

That’s now changing. In December 2018, Intel (INTC) introduced its new 3D chip technology. It plans to begin selling it later this year.

Tech reporters are touting it as “how Intel will beat Moore’s law.”

Chips stacked in 3D are far superior to ones placed side by side. Not only can you fit many times more transistors in the same footprint, you can also better integrate all the chip’s functions.

This shortens the distance information needs to travel. And it creates many more pathways for information to flow.

The result will be much more speed and power packed into a small space. Eventually, 3D chips could be 1,000 times faster than existing ones.

DNA Computing Is a Bit Further off, but Its Potential Is Mind-Boggling

DNA carries the instructions that enable life.

As incredible as it sounds, DNA can be used for computing. In 1994, Leonard Adleman, a computer scientist at the University of Southern California, used DNA molecules to solve a small instance of a well-known mathematical problem, the Hamiltonian path problem.

One pound of DNA has the capacity to store more information than all the computers ever built.

A thumbnail-size DNA computer could theoretically be more powerful than today’s supercomputers.

I won’t get deep into the science here. DNA computing is still very early stage. But several companies, including Microsoft (MSFT), are working to push the technology forward.

Quantum Computing Could Be the Ultimate Disruption

The science behind quantum computing will bend your mind. To understand its potential, all you really need to know is this.

The basic unit of conventional computation is the bit. The more bits a computer has, the more calculations it can perform at once, and the more powerful it is.

With quantum computing, the basic unit of computation is called a quantum bit—or qubit.

Bits scale linearly: to get to a 20-bit computer, you add 2 + 2 + 2 and so on, ten times over.

Qubits are different. Every additional qubit doubles the computing power.

So, a 10-qubit computer could do 2x2x2x2x2x2x2x2x2x2 calculations at once, or 1,024.

A 100-qubit quantum computer could perform over 1,000 billion billion billion simultaneous calculations. Those numbers are too big for humans to comprehend.
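A two-line sketch makes the scaling concrete (ordinary Python, nothing quantum about it; it just counts states):

```python
# n classical bits hold one of 2**n values at a time; n qubits can hold a
# superposition over all 2**n values at once, so each extra qubit doubles
# the size of the state space.
def state_space(n: int) -> int:
    return 2 ** n

print(f"{state_space(10):,}")     # 1,024 -- the article's 10-qubit figure
print(f"{state_space(100):.3e}")  # ~1.268e+30, i.e. over 1,000 billion billion billion
```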

In theory, a small quantum computer could exceed the power of a regular computer the size of the Milky Way galaxy.

With enough computing firepower, a quantum computer could, in principle, crack problems that are hopelessly out of reach for today’s machines.

If we ever achieve far-out goals like controlling the weather, colonizing Mars, or reversing human aging, quantum computing will likely be the driving force.

There Are No Pure-Play Quantum Computing Stocks

They’re all private or have been scooped up by larger companies.

Many of the big tech players are developing quantum computing technology. Microsoft, IBM, Google (GOOG), and Intel are a few.

Google looks to be in the lead.

In March 2018, it unveiled its Bristlecone quantum processor, which the company thinks could achieve “quantum supremacy.”

Quantum supremacy is the “tipping point” for quantum computing: the point at which a quantum computer outperforms even the best conventional computers at some task.

So far, scientists haven’t been able to crack this. But once quantum supremacy is reached, progress should take off very quickly.

This is yet another great reason to consider investing in Google.

Source: https://www.forbes.com/sites/stephenmcbride1/2019/04/23/these-3-computing-technologies-will-beat-moores-law/#d14179a37b0b

The Future

NVIDIA’s Accelerated Computing Platform To Power Japan’s Fastest AI Supercomputer

Tokyo Tech is building its next-generation TSUBAME supercomputer around NVIDIA GPU technology and the company’s Accelerated Computing Platform. TSUBAME 3.0, as the system will be known, will ultimately be used in tandem with the existing TSUBAME 2.5 system to deliver an estimated 64.3 PFLOPS of aggregate AI computing horsepower.

On its own, TSUBAME 3.0 is expected to offer roughly two times the performance of its predecessor. It will be built around NVIDIA’s Pascal-based Tesla P100 GPUs, which beat the previous-generation Maxwell GPUs in both performance per watt and performance per die area. TSUBAME 3.0 is estimated to deliver roughly 12.2 petaflops of double-precision compute performance, which would place it among the world’s 10 fastest systems according to the most recent TOP500 list.

[Image: a rendering of the Tokyo Tech supercomputer. Credit: NVIDIA]

The system’s architect, Tokyo Tech’s Satoshi Matsuoka, said, “NVIDIA’s broad AI ecosystem, including thousands of deep learning and inference applications, will enable Tokyo Tech to begin training TSUBAME 3.0 immediately to help us more quickly solve some of the world’s once unsolvable problems.”

TSUBAME 3.0 is being designed with AI computation in mind, and is expected to deliver more than 47 PFLOPS of AI horsepower on its own.
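The two AI-compute figures square with each other: subtracting TSUBAME 3.0’s share from the aggregate leaves the older system’s implied contribution (a rough check; the article only says “more than 47 PFLOPS,” so the exact split is an assumption):

```python
# Rough consistency check of the article's AI-compute figures (PFLOPS).
aggregate = 64.3   # TSUBAME 3.0 + TSUBAME 2.5 combined, per the article
tsubame3 = 47.2    # assumed value for "more than 47 PFLOPS"
print(f"Implied TSUBAME 2.5 share: ~{aggregate - tsubame3:.1f} PFLOPS")  # ~17.1
```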

“Artificial intelligence is rapidly becoming a key application for supercomputing,” said Ian Buck, vice president and general manager of Accelerated Computing at NVIDIA. “NVIDIA’s GPU computing platform merges AI with HPC, accelerating computation so that scientists and researchers can drive life-changing advances in such fields as healthcare, energy and transportation.”

TSUBAME 3.0 is expected to be completed this summer. It will be used for education and research at Tokyo Tech, and for information infrastructure for top Japanese universities, though there are plans to make the system accessible to private-sector researchers as well.

Source: https://www.forbes.com/sites/marcochiappetta/2017/02/20/nvidias-accelerated-computing-platform-to-power-japans-fastest-ai-supercomputer/#4cb509db7708

The Future

When 5G is here, a wireless supercomputer will follow you around

Next-generation tech like self-driving cars and augmented reality will need huge amounts of computing power.

AT&T (T) on Tuesday detailed its plan to use “edge computing” and 5G to move data processing to the cloud, in order to better support these new technologies.

“[Edge computing] is like having a wireless supercomputer follow you wherever you go,” AT&T said in a statement.

Rather than being sent to AT&T’s core data centers, which are often hundreds of miles away from customers, data will be routed to the company’s network of towers and offices, located closer to users.

Currently, data is either stored in those data centers or on the device itself.

“[Edge computing] gives the option now to put computing in more than two places,” Andre Fuetsch, president of AT&T Labs and chief technology officer, told CNN Tech.

For example, let’s say you’re wearing VR glasses but the actual virtual reality experience is running in the cloud. There could be a delay in what you see when you move your head if the data center is far away.

AT&T aims to reduce lag time by sending data to locations much closer to you. (AT&T has agreed to acquire Time Warner, the parent company of CNN. The deal is pending regulatory approval.)

5G networks will be driving these efforts. Experts believe 5G will have barely any lag, which means a lot of the computing power currently in your smartphone can be shifted to the cloud. This would extend your phone’s battery life and make apps and services more powerful.

In the case of augmented and virtual reality, superimposing digital images on top of the real world in a believable way requires a lot of processing power. Even if a smartphone could deliver on that promise, doing so would eat up its battery life.

With edge computing, data crunching moves from the device to the “edge” of the cloud: the physical points of the network that sit closest to customers.
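To get a feel for why distance matters, here’s a crude propagation-delay estimate (illustrative numbers of my own, not AT&T’s; real latency also includes routing, queuing, and processing time):

```python
# Light in optical fiber travels at roughly 200,000 km/s (~2/3 the speed of
# light in vacuum), so round-trip propagation delay grows with distance.
FIBER_SPEED_KM_PER_S = 200_000

def round_trip_ms(distance_km: float) -> float:
    return 2 * distance_km / FIBER_SPEED_KM_PER_S * 1000

print(f"Core data center ~500 km away: {round_trip_ms(500):.2f} ms")  # 5.00 ms
print(f"Edge site ~15 km away: {round_trip_ms(15):.2f} ms")           # 0.15 ms
# Propagation delay is only a floor on total lag, but it's a floor that
# edge computing lowers by shrinking the distance data must travel.
```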

5G will also enable faster speeds and could even open the door to new robotic manufacturing and medical techniques.

AT&T is rolling out edge computing over the “next few years,” beginning in dense urban areas.

Source: https://money.cnn.com/2017/07/18/technology/att-edge-computing/index.html
