Q&A

Post History

Is there an expected upper bound on the processing abilities of quantum computers?

1 answer  ·  posted 9mo ago by Julius H.‭  ·  last activity 8mo ago by ariyadanesh‭

#2: Post edited by Julius H.‭ · 2024-02-25T20:03:42Z (9 months ago)
#1: Initial revision by Julius H.‭ · 2024-02-25T20:02:28Z (9 months ago)
Is there an expected upper bound on the processing abilities of quantum computers?
A computer can store some number $n$ of bits in memory, and it can perform some number $m$ of essential computational operations, such as addition or multiplication, and rewrite those bits to memory, per some unit of time, $t$.

To give a rough estimate, consider a modern digital computer with 1 terabyte of memory and a CPU clock rate of 10 GHz (current world records, using overclocking, are approaching this number). This translates to $1 \times 10^{12} \times 8$ bits in memory (long-term storage), and $10 \times 10^{9}$ cycles per second. (Unfortunately, I don’t know exactly how “cycles” relate to what I tried to describe above, let us call it “a unit of calculation”.)

That means our CPU can (conjecturally) modify $10^{10}$ bits per second (10 billion), and stores $8 \times 10^{12}$ bits total (8 trillion).

It seems that the current capacity for storage far exceeds the current capability for computational speed, so the limiting factor on how large a data model of a system can be will be time. Let’s assume we have enough storage to perform any kind of computational simulation that would complete in a “reasonable” amount of time. For the sake of obtaining useful knowledge, I’ll cap this at 10 years.

This means that, because we can compute 10 billion bits per second, in 10 years we could process $10 \times 10^{9} \times 60 \times 60 \times 24 \times 365.25 \times 10$ bits, which is roughly $3.15 \times 10^{18}$, or about 3 quintillion. My estimation here may be inaccurate.
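
For concreteness, the arithmetic above can be replayed in a few lines of Python; the 10 GHz clock, the one-operation-per-cycle simplification, and the 1 terabyte of storage are just the assumptions stated above, not measured figures.

```python
# Rough budget for the hypothetical classical machine assumed above:
# 10 GHz clock, one bit-level operation per cycle, 1 TB of storage, 10 years of runtime.
ops_per_second = 10e9                       # 10 GHz, one "unit of calculation" per cycle
storage_bits = 1e12 * 8                     # 1 terabyte expressed in bits
seconds_per_year = 60 * 60 * 24 * 365.25
runtime_years = 10

total_ops = ops_per_second * seconds_per_year * runtime_years
print(f"storage:        {storage_bits:.2e} bits")     # ~8.00e12
print(f"10-year budget: {total_ops:.2e} operations")  # ~3.16e18
```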

What kind of system can be fully modeled in 3 quintillion computational operations?

Imagine modeling some subsystem of the Homo sapiens organism/body. If we go as low as subatomic physics, each subatomic particle is described by roughly 3 parameters taking rational values: mass, spin, and charge. (There are 17 known such particles.)

To represent a rational number to $n$ decimal places, imagine it as a sequence, or linked list, of digits. Each of the 10 distinct digits can be represented in binary with 4 bits ($2^{4} = 16 \geq 10$). Then an $n$-decimal-place rational number requires $4 \times n$ bits. Let’s assume we will represent rational numbers to 10 decimal places of precision. Then to represent the data of any subatomic particle, we need $3 \times 4 \times 10$ bits, or 120 bits.

To update the data of a subatomic particle, I am going to keep my estimation simple and assume we may have to perform one computation on each bit. We can process 3 quintillion bits in 10 years, meaning we could update the state of $\frac{3 \times 10^{18}}{120} = 2.5 \times 10^{16}$ subatomic particles in 10 years, or about 25 quadrillion.
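
As a sanity check on the per-particle bit count and the resulting particle-update budget, here is a small sketch; the 4-bits-per-digit (binary-coded decimal) encoding, the 10-digit precision, and the 3 parameters per particle are the assumptions from the two paragraphs above.

```python
# Encode a fixed-precision decimal value as 4-bit binary-coded-decimal digits,
# then estimate how many particles the 10-year budget could update once each.
BITS_PER_DIGIT = 4        # 2**4 = 16 >= 10, enough to encode one decimal digit
DIGITS_PER_VALUE = 10     # 10 decimal places of precision (an assumption)
PARAMS_PER_PARTICLE = 3   # mass, spin, charge

def bcd_bits(digits: str) -> int:
    """Bits needed to store the given decimal digits, one 4-bit nibble per digit."""
    assert digits.isdigit()
    return BITS_PER_DIGIT * len(digits)

bits_per_param = bcd_bits("0123456789")            # a 10-digit value -> 40 bits
bits_per_particle = PARAMS_PER_PARTICLE * bits_per_param
print(bits_per_particle)                           # 120

total_ops = 3e18                                   # rounded 10-year budget from above
particles_updated_once = total_ops / bits_per_particle
print(f"{particles_updated_once:.2e}")             # ~2.5e16 particle updates
```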

The human body is composed of cells, which are composed of organic molecules, which are composed of atoms, which are composed of subatomic particles. Let’s estimate that the average organic molecule is composed of 10 atoms (some more, some less), and that the average atom is composed of 20 protons, 20 neutrons, and 20 electrons. A proton and a neutron are each made of 3 subatomic particles (quarks), and an electron is itself a subatomic particle. Then each atom is on average made of $20 \times 3 + 20 \times 3 + 20 = 140$ subatomic particles, and each organic molecule of about 1,400 subatomic particles. I will guess that each cell is on average made of 1,000 organic molecules. Then each cell can be modeled by 1,400,000 subatomic particles, which can be modeled by $1.4 \times 10^{6} \times 120 = 1.68 \times 10^{8}$ bits, or about 170 million.
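
The same counting can be scripted; every multiplicity here (20 of each nucleon and electron per atom, 10 atoms per molecule, 1,000 molecules per cell) is just the rough guess from the paragraph above, not a measured value.

```python
# Particles and bits per cell, using the rough multiplicities assumed above.
QUARKS_PER_NUCLEON = 3
protons = neutrons = electrons = 20                    # assumed average per atom
particles_per_atom = protons * QUARKS_PER_NUCLEON + neutrons * QUARKS_PER_NUCLEON + electrons
particles_per_molecule = 10 * particles_per_atom       # ~10 atoms per organic molecule
particles_per_cell = 1_000 * particles_per_molecule    # ~1,000 molecules per cell

bits_per_particle = 120                                # from the encoding sketch above
bits_per_cell = particles_per_cell * bits_per_particle

print(particles_per_atom)      # 140
print(particles_per_cell)      # 1,400,000
print(f"{bits_per_cell:.2e}")  # ~1.68e8 bits per cell
```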

Then the state of each cell can be updated with about 170 million computational operations, so in 10 years we could update the state of a cell $\frac{3.15 \times 10^{18}}{1.68 \times 10^{8}} \approx 1.9 \times 10^{10}$ times, or about 19 billion. This means we could update the state of a single cell 19 billion times, or of 19 billion cells a single time, for example. If we wanted an equal balance between the number of cells and the number of updates per cell, it would be about 137,000 cells, each updated about 137,000 times.
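
Putting the last two paragraphs together, a few more lines divide the 10-year budget by the per-cell cost and take a square root for the “equal proportion” case (this just restates the arithmetic above):

```python
import math

total_ops = 3.15e18        # 10-year operation budget from above
bits_per_cell = 1.68e8     # per-cell cost from above

cell_updates = total_ops / bits_per_cell
balanced = math.isqrt(round(cell_updates))   # equal number of cells and updates per cell

print(f"{cell_updates:.2e}")   # ~1.9e10 total cell-updates
print(balanced)                # ~137,000 cells, each updated ~137,000 times
```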

According to my estimate, a modern computer could only fully model about 137,000 cells updating their state about 137,000 times, and it would take about ten years. Of course, this estimate needs more work.

How does this translate into quantum computing? The state space of a quantum computer grows exponentially with the number of qubits, so, in principle, far fewer quantum resources are needed before very large systems can be modeled effectively. Is there a realistic bound expected in the near future on what kinds of systems can be completely modeled by quantum computers? Could a quantum computer model an entire Homo sapiens body at the atomic level, in a usable time frame?
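
To make the “grows exponentially” claim concrete: an $n$-qubit state is described by $2^{n}$ complex amplitudes, so the classical memory needed just to write that state down doubles with every added qubit. The sketch below assumes 16 bytes per amplitude purely for illustration; it says nothing about any particular quantum machine.

```python
# Classical memory needed to store the full state vector of n qubits,
# assuming 16 bytes per complex amplitude (two 64-bit floats).
BYTES_PER_AMPLITUDE = 16

def state_vector_bytes(n_qubits: int) -> int:
    """Bytes needed to hold all 2**n complex amplitudes of an n-qubit state."""
    return (2 ** n_qubits) * BYTES_PER_AMPLITUDE

for n in (10, 30, 50, 100):
    print(f"{n:>3} qubits -> {state_vector_bytes(n):.3e} bytes")
# 10 qubits fit in kilobytes, 50 qubits already need ~1.8e16 bytes,
# and 100 qubits need ~2e31 bytes, far beyond the 8e12-bit machine assumed above.
```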