Is there an expected upper bound on the processing abilities of quantum computers?

A computer can store some number $n$ of bits in memory, and it can perform some number $m$ of elementary computational operations, such as addition or multiplication, rewriting those bits in memory, per unit of time $t$.

To give a rough estimate, consider a modern digital computer with 1 terabyte of memory (long-term storage) and a CPU clock rate of 10 GHz (current overclocking world records approach this number). This translates to $1 \times 10^{12} \times 8 = 8 \times 10^{12}$ bits in memory, and $10 \times 10^{9}$ cycles per second. (I am treating one clock cycle as one elementary "unit of calculation" in the sense above, which glosses over what a cycle actually accomplishes.)

That means our CPU can (conjecturally) modify $10^{10}$ bits per second (10 billion), and stores $8 \times 10^{12}$ bits total (8 trillion).

It seems that current storage capacity far exceeds current computational speed, so the limiting factor on how large a model of a system we can process will be time rather than space. Let's assume we have enough storage for any simulation that would complete in a "reasonable" amount of time; for the sake of obtaining useful knowledge, I'll cap this at 10 years.

Because we can modify 10 billion bits per second, in 10 years we could perform $10 \times 10^{9} \times 60 \times 60 \times 24 \times 365.25 \times 10 \approx 3.15 \times 10^{18}$ bit operations, or about 3 quintillion. My estimate here may be inaccurate.
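For a quick sanity check of this arithmetic, here is a minimal Python sketch using the figures assumed above:

```python
# Rough operation budget for the classical machine described above:
# 10 GHz clock, one bit modified per cycle, running for 10 years.
CLOCK_HZ = 10e9
SECONDS_PER_YEAR = 60 * 60 * 24 * 365.25
YEARS = 10

memory_bits = 1e12 * 8                                 # 1 terabyte -> 8e12 bits
bit_ops_budget = CLOCK_HZ * SECONDS_PER_YEAR * YEARS   # ~3.15e18 bit operations

print(f"memory bits:    {memory_bits:.2e}")
print(f"10-year budget: {bit_ops_budget:.2e}")
```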

What kind of system can be fully modeled in 3 quintillion computational operations?

Imagine modeling some subsystem of the Homo sapiens body. If we go as low as subatomic physics, each subatomic particle is described by roughly 3 parameters taking rational values: mass, spin, and charge. (The Standard Model has 17 known elementary particles.)

To represent a rational number to $n$ decimal places, imagine it as a sequence, or linked list, of digits. Each of the 10 distinct digits can be represented in binary with 4 bits ($2^{4} = 16 \geq 10$), so an $n$-decimal-place rational number requires $4 \times n$ bits. Let's assume we represent rational numbers to 10 decimal places of precision, i.e. 40 bits per parameter. Then to represent the data of a single subatomic particle, we need $3 \times 4 \times 10$ bits, or 120 bits.
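To make the encoding concrete, here is a small sketch of the one-nibble-per-digit idea; the parameter values are placeholders, and a real simulator would more likely use fixed- or floating-point formats:

```python
# Encode a decimal value as one 4-bit "digit" per decimal place (BCD-style),
# matching the counting argument above: 10 places -> 40 bits per parameter.
def digit_bits(value: float, places: int = 10) -> int:
    digits = f"{abs(value):.{places}f}".split(".")[1]   # keep the fractional digits
    return 4 * len(digits)                              # 4 bits per decimal digit

# Three illustrative parameters per particle (placeholder values):
params = {"mass": 0.5109989461, "spin": 0.5, "charge": 1.0}
total_bits = sum(digit_bits(v) for v in params.values())
print(total_bits)   # 3 parameters x 10 digits x 4 bits = 120 bits
```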

To update the data of a subatomic particle, I will keep the estimate simple and assume we have to perform one operation on each bit. We can perform about 3 quintillion bit operations in 10 years, meaning we could update the state of $\frac{3.15 \times 10^{18}}{120} \approx 2.6 \times 10^{16}$ subatomic particles in that time, or about 26 quadrillion.
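Carrying the budget through, a sketch under the same one-operation-per-bit assumption:

```python
BIT_OPS_BUDGET = 10e9 * 60 * 60 * 24 * 365.25 * 10   # ~3.15e18, from above
BITS_PER_PARTICLE = 3 * 4 * 10                        # 3 parameters x 10 digits x 4 bits

# Assumed: updating a particle costs one operation per bit of its state.
particle_updates = BIT_OPS_BUDGET / BITS_PER_PARTICLE
print(f"particle-state updates in 10 years: {particle_updates:.2e}")
# -> roughly 2.6e+16, about 26 quadrillion
```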

The human body is composed of cells, which are composed of organic molecules, which are composed of atoms, which are composed of subatomic particles. Let's estimate that the average organic molecule contains 10 atoms (some more, some less), and the average atom contains 20 protons, 20 neutrons, and 20 electrons. A proton and a neutron are each made of 3 subatomic particles (quarks), while an electron is itself a subatomic particle. Then each atom is on average made of $20 \times 3 + 20 \times 3 + 20 = 140$ subatomic particles, and each organic molecule of 1,400. I will guess that each cell is on average made of 1,000 organic molecules. Then each cell can be modeled by 1,400,000 subatomic particles, which require $1.4 \times 10^{6} \times 120 = 1.68 \times 10^{8}$ bits, or about 170 million.

Then the state of each cell can be updated with about 170 million computational operations, so in 10 years we can perform $\frac{3.15 \times 10^{18}}{1.68 \times 10^{8}} \approx 1.9 \times 10^{10}$ cell-state updates, or about 19 billion. This means we could update the state of a single cell 19 billion times, or of 19 billion cells a single time, for example. If we wanted an equal balance between the number of cells and the number of updates, it would be about $\sqrt{1.9 \times 10^{10}} \approx 140{,}000$ cells, each updated about 140,000 times.
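The whole cell-level estimate in one self-contained sketch, using the counts assumed above:

```python
import math

BIT_OPS_BUDGET = 10e9 * 60 * 60 * 24 * 365.25 * 10   # ~3.15e18 operations in 10 years
BITS_PER_PARTICLE = 3 * 4 * 10                        # 120 bits per particle

particles_per_atom = 20 * 3 + 20 * 3 + 20             # quarks in protons/neutrons + electrons
particles_per_molecule = particles_per_atom * 10      # ~10 atoms per organic molecule
particles_per_cell = particles_per_molecule * 1000    # ~1,000 molecules per cell

bits_per_cell = particles_per_cell * BITS_PER_PARTICLE   # ~1.68e8 bits
cell_updates = BIT_OPS_BUDGET / bits_per_cell             # ~1.9e10 updates in 10 years
balanced = math.isqrt(round(cell_updates))                # equal split: cells == updates

print(f"bits per cell:  {bits_per_cell:.2e}")
print(f"cell updates:   {cell_updates:.2e}")
print(f"balanced split: {balanced:,} cells x {balanced:,} updates each")
```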

According to my estimate, a modern computer could fully model only about 140,000 cells, each updating its state about 140,000 times, and it would take about ten years; that is a vanishingly small fraction of the tens of trillions of cells in a human body. Of course, this estimate needs more work.

How does this translate to quantum computing? The state space a quantum computer can address scales exponentially with the number of qubits, so, in principle, far fewer quantum resources should be needed before extremely large systems become tractable. Is there a realistic bound expected in the near future on what kinds of systems can be completely modeled by quantum computers? Could a quantum computer model an entire Homo sapiens body at the atomic level, in a usable time frame?
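To illustrate why the scaling is usually described as exponential (a rough sketch, not a statement about any particular quantum algorithm): storing the full state vector of $n$ qubits on a classical machine takes $2^{n}$ complex amplitudes, while the quantum device holds that state in just $n$ physical qubits.

```python
# Classical memory needed to hold the full state vector of n qubits,
# assuming 16 bytes per complex amplitude (two 64-bit floats).
BYTES_PER_AMPLITUDE = 16

for n in (10, 30, 50, 100):
    amplitudes = 2 ** n
    print(f"{n:>3} qubits -> {amplitudes:.3e} amplitudes, "
          f"{amplitudes * BYTES_PER_AMPLITUDE:.3e} bytes classically")
# Around 50 qubits, the state vector already exceeds petabytes of classical
# memory, while the quantum device stores it in 50 physical qubits.
```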

1 answer

There isn't a definitive answer on the upper bound of quantum computing's processing abilities, but here's what we know:

Classical Computers vs. Quantum Computers:

  • Classical computers: their state scales linearly with the number of bits. A register of n bits holds exactly one n-bit value at a time, so doubling the bits roughly doubles the amount of data handled at once.
  • Quantum computers: they use qubits, which can exist in superpositions of 0 and 1. A register of n qubits is described by 2^n amplitudes, so for certain problems the effective processing power grows exponentially with the number of qubits (see the sketch after this list).
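As a rough illustration of that contrast (a back-of-the-envelope sketch, not a claim about any specific quantum algorithm), the question's entire 10-year budget of about 3 × 10^18 classical bit operations is matched in state-space size by only about 62 qubits, since 2^62 ≈ 4.6 × 10^18:

```python
import math

# How many qubits give a state space at least as large as the question's
# 10-year classical bit-operation budget (~3.15e18)? Purely illustrative:
# state-space size is not the same thing as usable operations.
classical_budget = 3.15e18
qubits_needed = math.ceil(math.log2(classical_budget))
print(qubits_needed, 2 ** qubits_needed)   # 62, ~4.6e18
```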

Theoretical Potential:

  • In theory, a quantum computer with enough qubits could tackle problems impossible for classical computers, including complex simulations.

Challenges and Limitations:

  • Building large, stable quantum computers is incredibly difficult.
  • Not all problems benefit from a quantum approach. Some tasks are better suited for classical computers.

Modeling the Human Body:

  • Your estimation highlights the vast number of calculations needed to model even a single cell.
  • Quantum computers have the potential to significantly improve this, but there are limitations:
    • Maintaining the delicate quantum state of qubits is challenging, especially with complex simulations.
    • Even with faster processing, accurately modeling a system as intricate as the human body might require more qubits than we can realistically manage in the near future.

The Future:

  • Quantum computing is a rapidly evolving field. Predicting a realistic bound for its capabilities in the near future is difficult.
  • Modeling an entire human body at the atomic level in a usable timeframe might be beyond the reach of even future quantum computers. However, quantum computers could revolutionize our understanding of biological processes by tackling smaller, more manageable parts of the system.

In summary:

Quantum computers have the potential to break barriers in simulation, but there are technical hurdles to overcome. While modeling an entire human body might be a distant dream, these machines hold immense potential for scientific breakthroughs in the years to come.
