Physics

1803 readers
26 users here now

founded 4 years ago

So, I was reading about the Unruh effect. In short, if I understood correctly, a constantly accelerating observer detects particles in the vacuum that an inertial (non-accelerating) observer wouldn't, and correspondingly measures a higher temperature than the inertial observer does. This arises from a combination of quantum and relativistic phenomena. There even seems to be recent empirical support for it, but while reading about it I accidentally stepped into some pseudoscience, which left me in an emotional state where I find everything suspicious.
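For scale, the Unruh temperature follows the standard formula T = ħa/(2πck_B). A minimal sketch (my own illustration, not from the post) shows how tiny the effect is at everyday accelerations:

```python
import math

# Unruh temperature: T = hbar * a / (2 * pi * c * k_B)
hbar = 1.054571817e-34  # J*s (reduced Planck constant)
c = 2.99792458e8        # m/s (speed of light)
k_B = 1.380649e-23      # J/K (Boltzmann constant)

def unruh_temperature(a):
    """Temperature (K) seen by an observer with proper acceleration a (m/s^2)."""
    return hbar * a / (2 * math.pi * c * k_B)

print(unruh_temperature(9.81))  # 1 g: roughly 4e-20 K
print(unruh_temperature(1e20))  # even at 1e20 m/s^2: still well under 1 K
```

The linearity in a is why laboratory tests of the effect need such extreme accelerations.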

Anyway, even though I technically am a physicist, this is far from my area of expertise. I came up with a thought experiment and would like to ask a couple of questions related to it.

Let's imagine a spacecraft that makes a short trip: it accelerates enormously out into open space, then stops and comes back. My first question is this: would it be (theoretically) possible for the spacecraft, during the acceleration, to capture some of those particles that from an inertial perspective don't even seem to exist, store them, and bring them back as very concrete evidence of the Unruh effect? If not, why not?

Another question or two: is my intuition correct that if those collected particles were converted into energy, it would never be possible to gather more energy this way than was spent accelerating the spacecraft in the first place? If so, could one in some sense say that the energy put into the acceleration is what created those particles?


TL;DR: ICT treats matter as fixed information, consciousness as the rate of informational change, and time as the structuring of this change. The model unexpectedly drew interest from researchers in information physics (feedback below) and includes three concrete falsifiable experiments.

1. Core Idea

ICT is based on three relations:

A. Matter = fixed information: M = I_fixed

B. Consciousness = rate of informational change in time: C ∝ dI/dT (i.e., consciousness grows as the number of informational updates per unit time increases)

C. Reality = interaction of stable and flowing information: R = f(I_fixed, dI/dT)

This aligns with:

Landauer’s limit (energy cost of changing information)

Friston’s free-energy principle (entropy/information gradients)

Bekenstein bounds (informational density limits)

integrated-information ideas (but without assuming a biological substrate)

Key shift: Information is not an abstraction — it is the actual substrate of physics.

2. Time as an informational process

In ICT, time is defined as:

“The transition of potential information into structured experience.”

This connects:

subjective/phenomenological time

physical/relativistic time

computational/informational time

Consciousness shapes this transition — creating a local arrow of time through patterns of information change.

3. Experimental roadmap (all falsifiable)

Experiment 1 — C ∝ dI/dT (neuroenergetic test)

Task: multilevel oddball or sequence-learning with strict entropy control. Measurements: EEG or MEG + metabolic markers. Prediction: higher informational update-rate (dI/dT) increases both energetic cost and long-range neural integration.

Experiment 2 — R = f(I) (“structure without energy”)

Equal power input, but different informational structure: compressible vs pseudorandom signals, in sensory streams or light patterns. Prediction: informational form changes neural / behavioral / physical outcomes, even when energy is identical.

Experiment 3 — M = I_fixed (energy of fixation)

Measure energy thresholds for stable information across substrates: DRAM, Flash, PCM/memristors, spintronics, and possibly neural cultures. Prediction: matter behaves as stabilized information with substrate-dependent fixation thresholds.

4. External feedback

A researcher specializing in information physics and the nature of time — background:

MSU’s “Institute for Time Nature Explorations”

electrical engineering

information science

systemic research

interdisciplinary time studies

left a detailed review on Academia.edu.

Key excerpts:

"The author proposes an interesting approach to the relationship between matter, consciousness and information, incorporating the complex concept of time."

"'Matter as fixed information' opens a path toward an information physics of consciousness."

"The experimental framework is clear and promising."

— Irina L. Zerchaninova, researcher in information physics & time studies

5. Why I'm posting on Beehaw

ICT sits at the intersection of:

physics

computation

information theory

philosophy of mind

AGI research

This is an early-stage but testable model. Technical critique is welcome.

Links

Preprint (equations + experimental criteria): https://www.academia.edu/s/8924eff666

Main publication (open access): https://doi.org/10.5281/zenodo.17584783

PDF: https://www.academia.edu/144946662


Ninety million times a year, when protons crash together at the Large Hadron Collider (LHC), they produce, in their wreckage, a top quark and an anti-top quark, the heaviest known elementary particles. In the trillionth of a trillionth of a second before the particles decay into lighter pieces, they fly apart. But they remain quantum mechanically entangled, meaning each particle’s state depends on the other’s. If the top quark is measured to spin in one direction, the anti-top quark must spin the opposite way.

Top quarks are special. Other types of quarks quickly group together to form composite particles (such as neutrons) before the LHC's detectors can record their states. But top quarks decay before combining with other quarks. The particles they decay into contain a record of their spins — an observable fingerprint of their entanglement.
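The "opposite spins" statement can be illustrated with the textbook spin-singlet state. This is a toy calculation (the real top/anti-top spin density matrix depends on the production kinematics), but it reproduces the perfect anticorrelation:

```python
import numpy as np

# Pauli matrices
sx = np.array([[0, 1], [1, 0]], dtype=complex)
sy = np.array([[0, -1j], [1j, 0]], dtype=complex)
sz = np.array([[1, 0], [0, -1]], dtype=complex)

def spin_op(n):
    """Spin operator along unit vector n."""
    return n[0] * sx + n[1] * sy + n[2] * sz

# Spin-singlet state (|01> - |10>)/sqrt(2): total spin zero
psi = np.array([0, 1, -1, 0], dtype=complex) / np.sqrt(2)

def correlation(a, b):
    """<psi| (a.sigma) x (b.sigma) |psi>, equal to -a.b for the singlet."""
    op = np.kron(spin_op(a), spin_op(b))
    return float(np.real(psi.conj() @ op @ psi))

z = [0.0, 0.0, 1.0]
x = [1.0, 0.0, 0.0]
print(correlation(z, z))  # ~ -1: spins measured along the same axis are always opposite
print(correlation(z, x))  # ~ 0: no correlation along perpendicular axes
```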


Video on differentiation and its relation to kinematics.


A team of physicists from the University of Innsbruck and Harvard University has proposed a fundamentally new way to generate laser light: a laser without mirrors. Their study, published in Physical Review Letters, shows that quantum emitters spaced at subwavelength distances can constructively synchronize their photon emission to produce a bright, very narrow-band light beam, even in the absence of any optical cavity.


I recently rediscovered my interest in nuclear physics. It started with a question I hadn't asked since school:

How do neutrons prevent the protons in a nucleus from repelling each other?

The answer: they add attractive strong-force binding without adding electric repulsion, effectively holding the protons together. This works because the strong force is far stronger than electromagnetism, but has only a short range.

But why does this force have only a short range? Gravity and electromagnetism get weaker with distance, but never vanish.

That's because the force between nucleons is mediated by massive particles (pions), and a massive mediator makes the force fall off exponentially with distance.

Wait, what?

So now I’m looking for a textbook to explain these things in a more structured manner, from the ground up. But I also know that from a certain point onwards, physics becomes applied maths. So just any university textbook won’t do, since the math will quickly surpass my understanding.

Do you guys have any recommendations for a layman’s introduction to nuclear physics?
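For what it's worth, that "short reach" can be made quantitative with Yukawa's classic argument: the range of a force carried by a massive mediator is roughly the mediator's reduced Compton wavelength. A quick sketch (the values are the standard pion constants, not from the post):

```python
import math

# Yukawa potential V(r) ~ exp(-r/lam)/r, with range lam = hbar / (m_pi * c)
hbar_c = 197.327   # hbar*c in MeV*fm
m_pi_c2 = 139.57   # charged pion rest energy in MeV

lam = hbar_c / m_pi_c2  # range in femtometres
print(lam)  # ~1.4 fm, about the size of a nucleon

def yukawa_suppression(r_fm):
    """Factor by which the Yukawa force is suppressed relative to Coulomb at r."""
    return math.exp(-r_fm / lam)

print(yukawa_suppression(10.0))  # at 10 fm the force is already ~1000x suppressed
```

This is why the nuclear force dominates inside a nucleus yet is invisible at atomic scales.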


Clocks are atomic


Hi all. I’ve been developing a conceptual physics framework that proposes a new way of looking at quantum measurement, time, and classical emergence using what I’m calling ‘constraint field interactions’ as the underlying mechanism.

This isn’t a formal academic paper (yet); I don’t have an institutional affiliation or physics PhD. But I am very serious about developing this model coherently and rigorously. The work is still evolving, but the core idea is that reality may have stabilized through self-reinforcing patterns of constraint resolution, producing what we experience as time, classical causality, and observer-aligned outcomes.

The paper touches on:

  • quantum measurement as contextual constraint resolution
  • observer-dependent reference frames
  • shared reality through stable constraint fields
  • emergence of classical time as an output of constraint interactions
  • and more speculative ideas on pre-collapse structure and substrate-level information fields

I wrote it to be as accessible as possible while still diving deep into conceptual mechanics. I welcome critique, skepticism, alternate interpretations, and questions. If anyone here enjoys unpacking new ideas or spotting holes in speculative frameworks, I’d genuinely appreciate your thoughts. More than happy to send a copy or link to the full paper upon request.

Cheers!


An international study involving ICN2 (at the UAB campus), Xi'an Jiaotong University (Xi'an), and Stony Brook University (New York) has shown for the first time that ordinary ice is a flexoelectric material.

In other words, it can generate electricity when subjected to mechanical deformation. This discovery could have significant implications for the development of future technological devices and help to explain natural phenomena such as the formation of lightning in thunderstorms.


Abstract

Understanding the radiative decay of exciton-polaritons is essential for achieving long-lived polaritons - a key prerequisite for enhancing nonlinear and quantum polaritonic effects. However, conventional wisdom - the coupled oscillator model - often oversimplifies polariton radiation as independent emissions from uncoupled excitonic and photonic resonances, overlooking the role of strong exciton-photon coupling in reshaping their radiative behavior. In this work, we present a theoretical framework that goes beyond the conventional coupled oscillator model by fully accounting for the collective and coherent nature of exciton-photon interactions. We demonstrate that these interactions can strongly suppress polariton radiation via destructive interference - both within the excitonic ensemble and between excitonic and photonic radiation channels - giving rise to polaritonic bound states in the continuum with infinitely long radiative lifetimes. Our approach offers a unified description of polariton radiative decay and establishes new design principles for engineering long-lived exciton-polaritons with tailored radiation properties, opening new avenues for nonlinear, topological, and quantum polaritonic applications. 
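For readers unfamiliar with the baseline the abstract argues against: the conventional coupled-oscillator model diagonalizes a 2x2 Hamiltonian for an exciton at energy E_x coupled to a cavity photon at E_c with strength g. A minimal sketch with generic textbook values (my illustration, not from the paper):

```python
import math

def polariton_energies(E_x, E_c, g):
    """Upper/lower polariton energies from the 2x2 coupled-oscillator model."""
    mean = 0.5 * (E_x + E_c)
    delta = E_c - E_x                      # exciton-photon detuning
    split = math.sqrt(g**2 + 0.25 * delta**2)
    return mean - split, mean + split      # (lower, upper) polariton

# Zero detuning, 50 meV coupling (illustrative numbers in eV)
lp, up = polariton_energies(2.0, 2.0, 0.05)
print(up - lp)  # Rabi splitting = 2g = 0.1 eV at zero detuning
```

The abstract's point is that this model treats the two radiative channels independently, missing the interference effects that can produce bound states in the continuum.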

I'm thinking about writing a science fiction novel with as little 'fiction' as possible, set in a really, really far future, maybe millions of years from now, partly because I did not find any story covering this (although Duck AI said there are some stories about that, feel free to recommend!). I found one podcast about this thought experiment that I have yet to listen to, but nothing else.
So for as little fiction as possible, I of course need a somewhat realistic way of travelling. But not only that: communication would take far too long if colonies in different solar systems are light-years apart.

So I got inspired by the greatest of geniuses, Mister Patrick Star ("why don't we just take Bikini Bottom and move it somewhere else?"). Now my actual question: would it theoretically be possible to travel with the entire solar system? Somehow harness the Sun's energy and direct it in one direction (while still not leaving the colonised planets without sunlight), so that we 'fly' to the next solar system, and the distances between us and exoplanets become small enough that travelling and communicating between them takes a reasonable time? How would that affect gravity? Would it be possible to calculate this and avoid destroying the gravitational balance of our system or the Milky Way?

And further: could we then move on with the second solar system and start 'collecting' systems? To me, that sounds like the only realistic way to 1. colonise other planets (without evolving into too-different species) and 2. maybe even encounter alien life; perhaps when the Milky Way and Andromeda collide we will also find intelligence.
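As a sanity check on the premise (my own back-of-envelope estimate, not from the post): even if the Sun's entire luminosity could be converted into the system's kinetic energy with perfect efficiency, reaching even 1% of light speed would take hundreds of millions of years.

```python
# Toy energy budget for pushing the whole Solar System (standard values)
M_sun = 1.989e30   # kg; the Sun dominates the system's mass
L_sun = 3.828e26   # W; the Sun's total radiative output
c = 2.998e8        # m/s

v = 0.01 * c                       # target: a leisurely 1% of light speed
KE = 0.5 * M_sun * v**2            # non-relativistic kinetic energy, J
years = KE / L_sun / 3.156e7       # seconds of full solar output -> years
print(f"{years:.1e} years of the Sun's entire output")  # ~7e8 years
```

Real propulsion would be far less efficient than this ideal limit, so the timescale is a hard lower bound, though on a millions-of-years story timescale it is at least not obviously absurd.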

submitted 4 months ago* (last edited 4 months ago) by pcalau12i@lemmygrad.ml to c/physics@mander.xyz

If you do not have a proper education in physics, you probably should not be trying to speculate on building new models to "fix" it, because physics is kind of like a house of cards: if you change one thing somewhere, it's hard to know what rippling impacts it may have on other parts of physics, potentially producing obviously incorrect results even if the change seems reasonable. You thus need to have a pretty good understanding of the whole field if you want to speculate on changing it.

But that is the job of a theoretical physicist. People often poke fun at String Theorists for proposing things that don't have immediate practical use, but that is kind of their job to do that, no? They are paid specifically to speculate on new physics. Yes, it's speculation, but you kind of need some people to speculate and explore possibilities, that's helpful to make progress.

My concern, however, is that speculation seems to be allowed in some areas, but disallowed in others. If you speculate that general relativity is wrong and that it should be replaced by a deeper theory like String Theory, there is no issue. But if you were to speculate that quantum theory is wrong and it should be replaced by a deeper theory, well, that is treated as a huge taboo.

Indeed, I had posted a peer-reviewed paper in /r/askphysics and asked for people's opinions on it as a matter of discussion. I was immediately permabanned from the subreddit without explanation. I messaged a moderator and asked what rule on earth I had broken.

The moderator told me that they are themselves a PhD physicist, and that one of the paper's authors (of several) is Robert Spekkens, a theoretical physicist who has published papers on alternative models to quantum mechanics. He said that this makes Spekkens a "pariah" in academia, that everyone agrees on this, that if I were part of academia I would understand this as well, and that everyone is just waiting for people like him to die off.

The paper was not even about an alternative model to quantum mechanics. But the very idea that I posted a paper for discussion which one of the authors had also just so happen to work on alternative models, I'm told, is apparently grounds to be completely kicked out of any physics community.

This to me seems to be turning quantum physics into a religion. Why are theoretical physicists allowed to publish papers that question the fundamentality of general relativity, and that's all fine and dandy, but if a theoretical physicist publishes papers that question the fundamentality of quantum mechanics, suddenly they are a "pariah" and anyone who brings them up needs to be exorcised?

Keep in mind that the conclusion to John Bell's paper where he presented his theorem was not that it is impossible for a theory to replace quantum mechanics, but that if there existed one, it would have to be nonlocal. Bell himself also published papers on models of this kind.

Bell later stated in an interview with the BBC that you could make it work without nonlocality if the theory were superdeterministic, and a Nobel prize winner, Gerard 't Hooft, has indeed published a model of this form.

It has also been pointed out by the physicist Ken Wharton that you can have an alternative model if you drop the assumption of a fundamental arrow of time, as you can allow it to have causality that is symmetric in time. This is inspired by Yakir Aharonov's time-symmetric interpretation of quantum mechanics.

Note that this post has nothing to do with me. I am not saying that I shouldn't be made fun of if I try to publish an alternative theory, because I have no PhD in physics. I am asking why is it that a literal PhD physicist, such as Spekkens, is apparently a "pariah" if they do so? Not only is he apparently such a "pariah" that we aren't allowed to talk about his work, but we can't even talk about work he has co-authored even if it was with several other authors and the topic of the paper isn't even an alternative model to quantum mechanics?

I am not saying any of these ideas are even correct, I am not endorsing nonlocal models, superdeterministic models, or even time-symmetric models. I can even understand a person believing these models will go nowhere. I mean, String Theory has a lot of critics who think it will go nowhere as well. Loop Quantum Gravity might not go anywhere, either.

But it seems to me that there is a big difference between just not thinking it is the right route, and treating a physicist who researches that route as if they are a malignant cancer that just needs to die off. This reeks of religious zealotry, not science. Yes, it's speculation, but that's what theoretical physicists are literally paid to do. You don't see this kind of hostility for research into other kinds of speculative models.

We used to strongly believe Newtonian mechanics was fundamental; later we learned it isn't, and it was replaced by general relativity. Most people agree it is therefore fine to speculate that general relativity is not fundamental either, and to propose models that replace it. But why is it such a taboo, even for a professional academic with genuine credentials, to speculate that there might be something underneath quantum mechanics? Why does it make one a "pariah" even to ask that question?

How on earth is quantum mechanics a science if you are not even allowed to question it, even if it's a person with genuine credentials asking the questions, who is being paid specifically to research alternative models? That's not science, that's religion. There should be no issue with asking questions. If you truly think it is impossible that we will ever discover anything more fundamental than quantum mechanics, then you don't have to worry, because the research wouldn't go anywhere anyways. Trying to actively ostracize people and stop them from even looking into it does not seem like a very scientific approach, but is what I would expect out of a religious cult.

I see no issue with String Theorists or Loop Quantum Gravity theorists speculating that general relativity is not fundamental. Likewise, I see no issue with theoretical physicists speculating that quantum mechanics is not fundamental. As long as you have your credentials and are actually submitting your models to peer review so that you can engage with honest feedback from your peers, I don't see what the problem is.

Why is this such a hot take, apparently?


cross-posted from: https://lemmy.world/post/34131035

I asked Claude 4 Opus (thinking) about sunlight hitting a solar cell. Per Claude, the infrared wavelengths near the visible range are converted most efficiently. But a big part of sunlight is visible light, so a big part of a solar cell's electrical output comes from visible light.

Also, per Claude, "When light strikes an n-type monocrystalline silicon solar cell, photons must have sufficient energy to promote electrons from the valence band to the conduction band across silicon's bandgap. Silicon has a bandgap energy of approximately 1.12 electron volts (eV) at room temperature, which corresponds to a wavelength of about 1107 nanometers. This means photons with wavelengths longer than 1107 nm (energy less than 1.12 eV) cannot generate electron-hole pairs in silicon and pass through the material without contributing to electrical current. The relationship between photon energy and wavelength follows the equation E = hc/λ, where E is energy, h is Planck's constant, c is the speed of light, and λ is wavelength. Using this relationship, we can understand how different portions of the solar spectrum interact with silicon. Infrared radiation spans wavelengths from about 700 nm to 1 mm, with energies ranging from approximately 1.77 eV down to 0.001 eV. Only the near-infrared portion up to about 1107 nm can be absorbed by silicon. Photons in the 700-1107 nm range have energies between 1.77 and 1.12 eV, making them ideal for silicon solar cells as they have just enough energy to create electron-hole pairs with minimal excess energy lost as heat. Visible light extends from 400 to 700 nm, corresponding to energies of 3.1 to 1.77 eV. All visible photons have more than enough energy to overcome silicon's bandgap. However, the excess energy above 1.12 eV is largely wasted through thermalization, where hot carriers quickly lose their extra energy as heat. This fundamental thermalization loss limits the theoretical efficiency of single-junction silicon cells. Ultraviolet light, with wavelengths below 400 nm, carries energies above 3.1 eV. While UV photons are readily absorbed by silicon, their high excess energy results in significant thermalization losses. 
Additionally, UV photons are often absorbed very near the surface of the cell, where recombination rates are typically higher, further reducing their contribution to the cell's output current. The high surface recombination and thermalization losses make UV photons particularly inefficient for silicon solar cells despite their high energy."

I know about LLM hallucination, so I googled. I verified the energy formula and silicon's band gap.
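The quoted cutoff figure is easy to check numerically: with standard constants, λ = hc/E for a 1.12 eV gap does come out near 1107 nm (this is my verification, not part of the original post):

```python
# Cutoff wavelength for silicon's band gap: lambda = h*c / E
h = 6.62607015e-34    # J*s (Planck constant)
c = 2.99792458e8      # m/s (speed of light)
eV = 1.602176634e-19  # J per electron-volt

E_gap = 1.12 * eV              # silicon band gap at room temperature
lam_nm = h * c / E_gap * 1e9   # convert metres to nanometres
print(lam_nm)  # ~1107 nm, matching the quoted figure
```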

Was Claude right?
