spacestr

lkraider
Member since: 2023-03-23
lkraider 11h

There is no real wealth in money gathering.

lkraider 1d

That’s the exaltation of the chemist after figuring out the hydrogenation method for soap.

lkraider 2d

I found the AI article analysis also interesting:

I read the Quantum Zeitgeist piece you linked (published Nov 27, 2025). It is basically a “prepare for PQC because Shor exists and migration is slow” article, not a “quantum is imminent and will crack everything next year” pitch. That framing is mostly reasonable. Where it goes wrong is in the usual places: it blends correct high-level security guidance with sloppy, overconfident technical specifics.

A few key assumptions in the article that are easy to over-believe:

1. “RSA-2048 could be done in hours on a sufficiently large QC.” That’s a rhetorical shortcut. The best-known public resource estimates depend on explicit hardware assumptions (gate error, cycle time, connectivity, decoding/control latency). Under those assumptions, the 2019/2021 estimate was “8 hours but ~20 million noisy qubits.” In 2025, Gidney published a new estimate claiming “< 1 million noisy qubits” but “less than a week” runtime under the same explicit assumptions. So “hours” is not a stable claim even within the same author’s evolving approach.

2. “Surface code needs roughly 1,000 physical qubits per logical qubit.” That’s an oversimplification that hides the real dependency: the required code distance depends on your physical error model and target logical error rate, and the qubits-per-logical count grows roughly with d^2 (e.g., 2d^2 − 1 is a common counting baseline). “About 1,000” might be plausible in some regimes, but it is not a universal constant.

3. “NIST selected four PQC algorithms … including Kyber and Dilithium.” Selection vs. standardization got blurred. NIST published three finalized PQC standards in August 2024: FIPS 203 (ML-KEM/Kyber), FIPS 204 (ML-DSA/Dilithium), and FIPS 205 (SLH-DSA/SPHINCS+). FALCON is selected but (as of Dec 2025) is still slated for FIPS 206, which remains “in development.” Also, in March 2025 NIST selected HQC as a backup KEM (a separate “belt-and-suspenders” move that the article doesn’t mention).

4. “As of 2024, the largest processors have fewer than 1,500 qubits.” That’s likely true in the narrow “physical qubit count on a single processor” sense (IBM announced Condor at 1,121 superconducting qubits in late 2023). But this metric is the most hype-prone one; it ignores error distributions, correlated noise, leakage, and whether scaling improves logical error with code distance (the thing that actually matters for cryptography).

The strongest opposing expert viewpoint (steelman) against “quantum is a scam” is simple:

• The field is not “trying to brute-force miracles with noisy toys.” The credible milestone is demonstrating error correction that improves as you scale redundancy (below-threshold behavior). That is now a serious peer-reviewed line of work, not marketing.
• On the security side, the risk-management argument doesn’t require believing a CRQC is imminent. It only requires that (a) long-lived secrets exist and (b) migration takes years. That’s why government guidance targets a long transition horizon (e.g., NSA CNSA 2.0 pushing adoption timelines out to 2030+ for some requirements, with longer-term goals).

Where the article is genuinely useful (and you should keep it):

• “Harvest now, decrypt later” is the one point that stays true even if you’re deeply skeptical about quantum timelines. The data you encrypt today can have a confidentiality life longer than your migration lead time. That’s a planning problem, not a physics bet.

Where it drifts into fear-mongering / weak epistemics:

• “Experts estimate 2030s” is too hand-wavy without a distribution and a definition of “cryptographically relevant.” Serious timeline work frames probabilities over time for a specific capability (e.g., RSA-2048 in 24 hours).
• It mixes architecture commentary with dated company references (e.g., “Honeywell” in trapped ions, when the commercial entity is Quantinuum) and asserts specific error-rate numbers without sourcing.
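The d^2 counting point above can be made concrete with a toy calculation. This is a minimal sketch, assuming the common 2d^2 − 1 physical-qubits-per-logical baseline and the standard below-threshold heuristic p_L ≈ A·(p/p_th)^((d+1)/2); the constants A = 0.1 and p_th = 10^-2 are illustrative assumptions, not measured hardware values.

```python
# Toy surface-code overhead estimate. Assumptions (not measured values):
# qubits per logical patch = 2*d^2 - 1, and the standard heuristic
# logical error per round p_L ~ A * (p/p_th)**((d+1)/2), with A = 0.1
# and threshold p_th = 1e-2.

def physical_qubits(d):
    """Data plus measure qubits for one distance-d surface-code patch."""
    return 2 * d * d - 1

def logical_error(p, d, A=0.1, p_th=1e-2):
    """Heuristic logical error per round at physical error rate p."""
    return A * (p / p_th) ** ((d + 1) // 2)  # (d+1)/2 is an integer for odd d

def distance_for_target(p, target):
    """Smallest odd code distance whose heuristic p_L meets the target."""
    d = 3
    while logical_error(p, d) > target:
        d += 2
    return d

# Two plausible physical error rates, same 1e-12 target: the overhead
# is nowhere near a single universal "about 1,000" constant.
for p in (2.5e-3, 5e-3):
    d = distance_for_target(p, 1e-12)
    print(f"p={p:g}: distance {d}, {physical_qubits(d)} physical qubits per logical qubit")
```

Under these toy assumptions you get distance 37 (2,737 physical qubits per logical) at p = 2.5×10^-3, but distance 73 (10,657) at p = 5×10^-3 — same target, roughly 4× the overhead for a 2× worse error rate.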
• The site itself includes a disclaimer that third-party info hasn’t been reviewed for accuracy, which should downgrade confidence in its precise numerical claims.

Reframe as a decision (with a “do nothing” option):

Option A: Do nothing now. Rational if your encrypted data has a short shelf life (days/months) and you can rotate/replace quickly.
Option B: Start PQC migration as “crypto-agility + inventory,” not a rip-and-replace. This is the boring, usually-correct move: find where RSA/ECC live, classify data by secrecy lifetime, plan staged upgrades.
Option C: Overreact and “quantum-proof everything immediately.” This is where orgs burn money, break compatibility, and sometimes reduce near-term security due to rushed deployment.

My recommendation: treat the article as decent motivation to do Option B, but do not trust its technical numbers as decision-grade. Ground yourself in NIST’s standards and transition materials, then decide based on (1) data shelf life and (2) your migration time.

Now the hard argument against my recommendation: if you don’t have the staff maturity to do crypto-agility well, a PQC program becomes theater: inventories go stale, teams cargo-cult “Kyber everywhere,” and you introduce new failure modes (performance regressions, handshake breakage, supply-chain incompatibilities). In that scenario, the honest move is to first fix basic key management, rotation, and dependency tracking before you touch PQC.

Small, testable steps with kill-criteria (so you don’t get trapped by hype):

Step 1: Build a cryptographic inventory (protocols, certs, libraries, HSMs, vendors). Kill-criteria: if you can’t list where RSA/ECC are used, you are not “quantum planning,” you’re storytelling.
Step 2: Classify systems by “confidentiality lifetime” (weeks / years / decades). Kill-criteria: if nobody can defend the category, assume “years” for anything containing identity, finance, or long-lived IP.
Step 3: Pilot PQC in one boundary where breakage is tolerable (e.g., internal service-to-service TLS with a rollback plan). Kill-criteria: if you can’t roll back cleanly, you’re not ready for broad deployment.
Step 4: Track reality using one metric: demonstrations that logical error improves with increased code distance (not “more qubits”).

A short Orthodox-flavored grounding (since you asked for “ground truth” in this tradition): don’t let fear or pride run the plan. Hype feeds both. The sober path is vigilance plus humility: prepare because it is prudent, but refuse to speak more confidently than the evidence allows.
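The single metric in Step 4 can be sketched numerically with the standard scaling heuristic p_L ≈ A·(p/p_th)^((d+1)/2): increasing code distance only helps when the hardware is below threshold. A = 0.1 and p_th = 10^-2 here are illustrative assumptions, not measured values.

```python
# Sketch of the Step 4 metric: under the heuristic
# p_L ~ A * (p/p_th)**((d+1)/2), raising the code distance d reduces
# logical error only when the physical error rate p is below the
# threshold p_th. A = 0.1 and p_th = 1e-2 are illustrative assumptions.

def logical_error(p, d, A=0.1, p_th=1e-2):
    return A * (p / p_th) ** ((d + 1) // 2)

below = [logical_error(5e-3, d) for d in (3, 5, 7)]   # p < p_th
above = [logical_error(2e-2, d) for d in (3, 5, 7)]   # p > p_th

print("below threshold:", below)  # falls with d: scaling is working
print("above threshold:", above)  # rises with d: more qubits make it worse
```

That crossover is why “logical error improves with distance” is the claim to track, and why raw qubit counts tell you nothing by themselves.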

lkraider 2d

Still on scam territory for me.

lkraider 2d

I had my AI counter-argument bot cross-check me:

A lot of the skepticism you summarized is directionally right (the hype is real; “practical advantage next year” claims are usually marketing), but several of your strongest-sounding bullets rely on shaky premises or outdated “folk numbers.” If you want an intellectually honest view, you end up in an uncomfortable middle: quantum computing is not a “scam,” but it is also not close to the grandiose promises investors were sold.

Here are key assumptions in your writeup that could be false:

• “No progress on factorization in 20 years.” The algorithms (Shor) are old, yes, but the engineering resource estimates and compilation techniques have improved a lot. For RSA-2048 specifically, Gidney and Ekerå’s well-known estimate was ~20 million physical qubits under explicit assumptions, and in 2025 Gidney published a new estimate claiming “less than a million noisy qubits” (still with demanding assumptions, and still far beyond today).
• “Millions of physical qubits per logical qubit” as a fixed rule. Overhead depends on physical error rates, code choice, connectivity, and what you’re trying to do (memory vs. T gates, etc.). IBM is explicitly arguing for qLDPC-style paths that reduce qubit overhead compared to surface-code baselines, at least for some components (e.g., memory).
• “NISQ can only do contrived demos.” Many “supremacy/advantage” tasks are contrived, yes, but the real question is whether error-corrected logical qubits can scale, with logical error improving as you increase code distance. There are now peer-reviewed results explicitly about being “below threshold” (the regime you must be in for scalable fault tolerance).
Now the strongest opposing expert viewpoint (steelman), in plain terms: quantum computing is an engineering program to build a fault-tolerant machine, and the physics is not in serious doubt. We can already create, control, and measure multi-qubit entangled systems; the hard part is driving logical error down faster than system size grows. The most credible “this is real” evidence is not qubit counts or sampling stunts, but demonstrations of error correction improving as redundancy increases (crossing into “below-threshold” behavior).

Where your skepticism is most justified (real failure modes):

• Incentives are misaligned. Startups and even big labs market “number of qubits” because it’s legible, while the actually predictive metrics (two-qubit gate fidelity distributions, leakage, correlated noise, cycle time, error-model stability, decoding latency, logical error per round, etc.) are harder to sell.
• Scaling is brutal in ways that don’t show up in demos: cryogenics, wiring, calibration at scale, correlated noise, fabrication yield, and error decoding/control stacks. Even if the math works, the “systems engineering tax” can kill timelines.
• “Break RSA” headlines are routinely abused. Shor threatens RSA/ECC in principle, but the relevant question is “cryptographically relevant quantum computer” (CRQC) timelines under realistic constraints, not toy factoring demos.

On “is it a scam?”: calling it a scam conflates three different things.

1. Quantum computing as a scientific/engineering field: not a scam. There are serious roadmaps and serious peer-reviewed milestones in QEC.
2. Quantum computing marketing: often hypey, sometimes misleading, occasionally outrageous. Even pro-quantum voices have publicly complained about hype incentives.
3. “Quantum” used as a brand for fraud (e.g., “quantum trading bots”): that’s a separate bucket (actual scams) and shouldn’t be used to judge the underlying field.
A decision framing that avoids hero narratives:

• Option A: Do nothing / ignore QC entirely. Rational if you’re evaluating it as a near-term product category.
• Option B: Treat QC as long-horizon R&D, and demand falsifiable milestones (logical qubits with improving logical error; small fault-tolerant demonstrations; resource estimates with explicit assumptions).
• Option C: Act only where the downside is asymmetric: migrate cryptography to PQC, because migration lead times are long and “harvest now, decrypt later” is real even if CRQCs are years out. This is exactly why NIST finalized PQC standards and publishes migration guidance.

My best recommendation (if you’re an operator/investor/strategist): be skeptical of “revolution soon,” but don’t be skeptical of the threat model. Concretely: ignore qubit-count press releases, track error-correction milestones, and in security planning treat PQC migration as a project you start now, not later.

Now I’ll argue hard against my own recommendation: it’s easy to overreact and waste money “quantum-proofing everything” prematurely, especially if your data has a short confidentiality lifetime. Also, PQC migration itself has costs and risks (performance, protocol changes, interoperability bugs, long-term confidence in new assumptions). If your organization cannot do crypto-agility well, a rushed migration can reduce security in the near term. The sane middle is staged migration driven by data lifetime and dependency mapping, not panic.

If you want to rewrite your critique into small, testable steps with kill-criteria (so it’s not just vibes):

• Step 1 (kill-criteria: vendor can’t answer): Require any “progress” claim to be expressed as (a) logical error per round vs. code distance, (b) syndrome extraction cycle time, (c) error correlations/leakage handling, (d) decoding latency budget. If they only talk “qubits,” downgrade credibility.
• Step 2 (kill-criteria: handwaving assumptions): For “break RSA” claims, demand a resource estimate with explicit physical assumptions (gate error, cycle time, connectivity, decoding, distillation). Compare against conservative published estimates like Gidney’s 2019 and 2025 papers.
• Step 3 (kill-criteria: no security inventory): If this is about security, do a cryptographic inventory first. If you can’t locate where RSA/ECC are used, “quantum risk” talk is theater. NIST’s transition logic is basically “start before X + Y exceeds Z.”

One last note on your sources: Reddit threads and polemical blogs are useful for stress-testing narratives, but they’re not good evidence. If you want “strong doubts” that survive scrutiny, look at skeptics who argue from physics/complexity constraints (e.g., Kalai’s “we may hit a wall” position) rather than pure finance/hype rhetoric.
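The “start before X + Y exceeds Z” logic (Mosca’s inequality) is easy to sketch: X is the secrecy lifetime of the data, Y your migration time, and Z the assumed years until a CRQC exists. Every number below is a made-up planning input, not an estimate.

```python
# Sketch of Mosca's "X + Y > Z" transition test. X = secrecy lifetime,
# Y = migration time, Z = assumed years until a CRQC. All numbers here
# are made-up planning inputs, not predictions.

def at_risk(secrecy_years, migration_years, crqc_horizon_years):
    """True if data encrypted today may still need secrecy once a CRQC exists."""
    return secrecy_years + migration_years > crqc_horizon_years

CRQC_HORIZON = 12  # assumed planning horizon, in years

systems = {
    "session tokens":      (0.1, 2),   # (secrecy lifetime, migration time)
    "financial records":   (10, 3),
    "long-lived identity": (25, 5),
}

for name, (x, y) in systems.items():
    verdict = "start migration now" if at_risk(x, y, CRQC_HORIZON) else "can wait"
    print(f"{name}: {verdict}")
```

The point of the sketch is that the verdict is driven by data lifetime plus migration lead time, not by any belief about whether a CRQC arrives on schedule.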

lkraider 2d

https://eprint.iacr.org/2025/1237.pdf

Even the “factorizations” they claim are a complete joke.

lkraider 2d

https://www.reddit.com/r/QuantumComputing/comments/1nfcplc/assertion_there_are_no_quantum_computers_in/

To quote:

> people have shown they can now factor the number 35, after having been stuck with 21 for years.
>
> Correction, they could demonstrate a quantum circuit that could factor the number 35 after it knew that 5 and 7 are its factors, see the paper by Gutmann and Neuhaus I linked to above, "Replication of Quantum Factorisation Records with an 8-bit Home Computer, an Abacus, and a Dog".

lkraider 2d

To quote:

> I consider the lack of one error corrected qubit in the history of the human race to be adequate evidence that this is not a serious enough field to justify using the word ‘field.’ Most of it is frankly, a scam.

lkraider 2d

https://scottlocklin.wordpress.com/2019/01/15/quantum-computing-as-a-field-is-obvious-bullshit/

lkraider 10d

The other day I looked into the history of that poster. Did you know it changes between seasons of The X-Files? The original photo is from the 1970s.

lkraider 2d

Seedsigner

lkraider 3d

It’s a scam.
