The Naked Scientists

Show Posts

This section allows you to view all posts made by this member. Note that you can only see posts made in areas you currently have access to.


Topics - Jarek Duda

1
Physics, Astronomy & Cosmology / What is the size and shape of single optical photon?
« on: 30/04/2021 05:23:37 »
An optical photon is produced e.g. during deexcitation of an atom, carrying away the energy, momentum and angular momentum difference.
So how is this energy distributed in space - what are the shape and size of a single photon?

Looking through the literature, I have found work started by Geoffrey Hunter; here is one of his articles: "Einstein’s Photon Concept Quantified by the Bohr Model of the Photon" https://arxiv.org/pdf/quant-ph/0506231.pdf

Most importantly, he claims that such a single optical photon has a shape similar to an elongated ellipsoid of length equal to the wavelength λ and diameter λ/π (?), providing reasonable-looking arguments:
Quote
1) Its length of λ is confirmed by:

– the generation of laser pulses that are just a few periods long;
– for the radiation from an atom to be monochromatic (as observed), the emission must take place within one period [10];
– the sub-picosecond response time of the photoelectric effect [11];

2) The diameter of λ/π is confirmed by:

– the attenuation of direct (undiffracted) transmission of circularly polarized light through slits narrower than λ/π: our own measurements of the effective diameter of microwaves [8,p.166] confirmed this within the experimental error of 0.5%;
– the resolving power of a microscope (with monochromatic light) being  “a little less than a third of the wavelength”; λ/π is 5% less than λ/3, [12];

Is this the proper answer?
Are there other reasonable answers or experimental arguments?

Updates: a paper by a different author: https://arxiv.org/pdf/1604.03869
Quote
the length of a photon is half of the wave length, and the radius is proportional to square root of the wavelength
2021 "The size and shape of single photon" http://dx.doi.org/10.4236/oalib.1107179
Related: https://physics.stackexchange.com/questions/612110/is-it-possible-to-confine-a-photon-in-less-than-its-wavelength

Here is a paper attempting to model the emission of a photon from hydrogen: https://link.springer.com/chapter/10.1007/0-306-48052-2_20

2
Physics, Astronomy & Cosmology / Similarity between particle physics and superfluid e.g. fluxons?
« on: 07/02/2021 08:29:37 »
Especially in superconductors/superfluids, so-called macroscopic quantum phenomena are observed ( https://en.wikipedia.org/wiki/Macroscopic_quantum_phenomena ) - stable constructs like the fluxon/Abrikosov vortex, quantizing the magnetic field due to topological constraints.
Interference ( https://journals.aps.org/prb/abstract/10.1103/PhysRevB.85.094503 ), tunneling ( https://journals.aps.org/prb/pdf/10.1103/PhysRevB.56.14677 ), and Aharonov-Bohm ( https://www.sciencedirect.com/science/article/pii/S0375960197003356 ) effects are observed for these particle-like objects.

This raises the question of whether this similarity with particle physics could be taken further - and how far?
E.g. there is Volovik's famous book "The Universe in a Helium Droplet" ( http://www.issp.ac.ru/ebooks/books/open/The_Universe_in_a_Helium_Droplet.pdf ).
Maybe let us discuss it here - any interesting approaches?

For example, there are biaxial nematic liquid crystals: made of molecules with 3 distinguishable axes.
We could build a hedgehog configuration (topological charge) with one of these 3 axes, additionally requiring a magnetic-like singularity for a second axis due to the hairy-ball theorem ... doesn't it resemble the 3 leptons: asymptotically the same charge (+ magnetic dipole), but with different realization/mass?


3
Physics, Astronomy & Cosmology / What is atomic orbital from QM interpretations perspective?
« on: 19/10/2020 06:08:22 »
While an electron and a proton far apart may be imagined as nearly point particles, when they approach to ~10^-10 m (or much more for Rydberg atoms), the electron is said "to become" this relatively huge wavefunction - the orbital, describing the probability distribution of finding the electron (confirmed experimentally, e.g. https://journals.aps.org/prb/abstract/10.1103/PhysRevB.80.165404 ).

Can we specify at what e-p distance this qualitative change happens?

How should we think about this orbital from the perspective of QM interpretations - is it a superposition of the electron (an indivisible charge) being in all these places? Is the electric field of the orbital a superposition over the electron being in all these places, or rather a mean?

E.g. in the Many Worlds Interpretation, should we imagine that the electron has a different position in each World?

In such a superposition, is each electron staying still or moving? If staying, where does e.g. the orbital angular momentum come from? If moving, why is there no synchrotron radiation?

4
Physics, Astronomy & Cosmology / Is the Stern-Gerlach experiment a proper idealization of measurement?
« on: 18/10/2020 08:33:43 »
I have met statements that the Stern-Gerlach experiment can be seen as an idealization of quantum measurement: we start with a random (continuous) direction of spin and end with parallel or anti-parallel alignment (discrete).

Is it a proper analogy/idealization of measurement? How can the differences be characterized?
Are there other interesting analogies?

In Stern-Gerlach we have a magnetic dipole traveling in an external magnetic field, which means torque (tau = mu x B), hence precession - additional energy, e.g. kinetic ... unless in parallel or anti-parallel alignment (tau = 0). So can it be seen as radiation of the excess energy, as in atomic deexcitation?

ps. A recent paper: "Tracking the Dynamics of an Ideal Quantum Measurement" https://journals.aps.org/prl/abstract/10.1103/PhysRevLett.124.080401

5
Physics, Astronomy & Cosmology / How to choose random walk, diffusion? (local vs global entropy maximization)
« on: 03/09/2020 06:35:57 »
To choose a random walk on a graph, it seems natural to assume that the walker jumps along each possible edge with the same probability (1/degree) - such a GRW (generic random walk) maximizes entropy locally (for each step).
Discretizing continuous space and taking the infinitesimal limit, we get the various diffusion models in use.

However, looking at the mean entropy production - averaged over the stationary probability distribution of nodes - its maximization usually leads to a slightly different walk, MERW: https://en.wikipedia.org/wiki/Maximal_entropy_random_walk

This raises the crucial question of which philosophy we should choose for various applications - I would like to discuss it.

GRW
- uses an approximation of the (Jaynes) principle of maximum entropy: https://en.wikipedia.org/wiki/Principle_of_maximum_entropy
- has no localization property (nearly uniform stationary probability distribution),
- has a characteristic length of one step - so it e.g. depends on the chosen discretization of a continuous system.

MERW
- is the walk maximizing mean entropy, "the most random among random walks",
- has a strong localization property - its stationary probability distribution is exactly the quantum ground state,
- is the limit of taking the characteristic step length to infinity - it is discretization independent.

A simulator of both for electron conductance: https://demonstrations.wolfram.com/ElectronConductanceModelsUsingMaximalEntropyRandomWalks/
A diagram with an example of evolution and stationary density, and some formulas (MERW uses the dominant eigenvalue):
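The contrast above can be sketched numerically; a minimal illustration (my own, independent of the linked simulator), comparing GRW and MERW on a path graph - a discretization of the [0,1] infinite potential well:

```python
import numpy as np

# Path graph with N nodes - a discretization of the [0,1] infinite potential well.
N = 30
A = np.zeros((N, N))
for i in range(N - 1):
    A[i, i + 1] = A[i + 1, i] = 1.0  # adjacency matrix

# GRW: jump along each edge with probability 1/degree;
# its stationary distribution is proportional to node degree -> nearly uniform.
deg = A.sum(axis=1)
S_grw = A / deg[:, None]
rho_grw = deg / deg.sum()

# MERW: S[u,v] = A[u,v] * psi[v] / (lam * psi[u]),
# where (lam, psi) is the dominant eigenpair of A (Frobenius-Perron).
w, V = np.linalg.eigh(A)
lam, psi = w[-1], np.abs(V[:, -1])
S_merw = A * psi[None, :] / (lam * psi[:, None])

# MERW stationary distribution is psi^2 - the quantum ground state of the
# well (~ sin^2), strongly localized instead of nearly uniform.
rho_merw = psi ** 2

print(rho_grw.round(3))
print(rho_merw.round(3))
```

GRW's stationary density stays nearly uniform (edge nodes differ only through their smaller degree), while MERW's is the sin^2-like ground-state profile, localized in the middle.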


6
Physics, Astronomy & Cosmology / What does it mean that physics is time/CPT symmetric?
« on: 02/08/2020 09:13:38 »
Time/CPT symmetry is at the heart of many models of physics, like unitary evolution in quantum mechanics, or the Lagrangian formalism we use from classical mechanics and electromagnetism up to general relativity and quantum field theories.
In theory we should be able to decompose any scenario (a history of the Universe?) into an ensemble of Feynman diagrams, apply CPT symmetry to all of them, and get the CPT analogue of the entire scenario (?)

There are many QM-based experiments which in a sense use time symmetry (?), for example (slides with links):
the Wheeler experiment, the delayed choice quantum eraser (DCQE), "asking photons where they have been", "photonic quantum routers", the Shor algorithm as a more sophisticated DCQE.

However, this symmetry is quite nonintuitive and very difficult to really accept - mainly due to irreversibility and thermodynamical counterarguments (are there other reasons?)
Can e.g. this conflict with the 2nd law of thermodynamics be resolved by just saying that the symmetry of fundamental theories can be broken at the level of a solution, like a rock thrown into a symmetric lake surface?
Are all processes reversible? (e.g. wavefunction collapse, measurement)

So is our world time/CPT symmetric?
What does it mean?

Personally I interpret it as meaning that we live in 4D spacetime - (Einstein's) block universe/eternalism: we only travel through some solution (a history of the universe) already found in a time/CPT-symmetric way, like the least action principle or the Feynman path/diagram ensemble - is this the proper way to understand this symmetry?
Are there other ways to interpret it?


7
Physics, Astronomy & Cosmology / What happens with quarks in nuclei? Do they help with binding?
« on: 28/06/2020 09:19:17 »
Nucleons being built of 3 quarks is seen as common knowledge.
So what happens with the quarks when multiple nucleons bind into a nucleus?

This seems like a basic question - also about the charge distribution in nuclei - but it is really tough to find any reliable materials about it. Any thoughts?

Some more detailed questions:
- To model e.g. deuteron + neutron/proton scattering, there are fitted ~40-parameter nucleon-focused models which require 3-body forces ( https://en.wikipedia.org/wiki/Three-body_force ), neglecting quarks - would we still need 3-body forces if quarks were included in the considerations?
- We know the deuteron has a large quadrupole moment ( https://en.wikipedia.org/wiki/Deuterium#Magnetic_and_electric_multipoles ), which naively requires multiple charges - why are quarks not considered here?
- Or: what is the charge distribution of the neutron ("built of 3 quarks")? I have seen papers claiming a positive core and a negative shell, e.g. http://www.actaphys.uj.edu.pl/fulltext?series=Reg&vol=30&page=119

8
Physics, Astronomy & Cosmology / Time crystal (self organizing into periodic process) - is electron its example?
« on: 19/03/2020 08:22:25 »
Time crystals, first described by Frank Wilczek in 2012, have received a lot of attention; a recent popular review: https://physicsworld.com/a/time-crystals-enter-the-real-world-of-condensed-matter/

If I understand properly, they would like a lowest-energy state spontaneously self-organizing into a periodic process - and propose sophisticated experiments, e.g. in solid state, or ping-pong of a Bose-Einstein condensate, which don't really seem to satisfy this defining requirement (?)

But Louis de Broglie already postulated in 1924 that with the electron's mass there comes an ≈10^21 Hz intrinsic oscillation: E = mc^2 = hf = ħω, obtained by using the E = mc^2 rest-mass energy in the stationary solution of Schrödinger's equation: ψ = ψ0 exp(iEt / ħ).
Similar oscillations come out of solutions of the Dirac equation - called Zitterbewegung ("trembling motion"). Here is one of its experimental confirmation papers - observing increased absorption when the ticks of such a clock agree with the spatial lattice of a silicon crystal target: https://link.springer.com/article/10.1007/s10701-008-9225-1
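As a quick back-of-the-envelope check of the quoted frequency (my own arithmetic with CODATA constants; the ≈10^21 figure matches the angular frequency ω = mc²/ħ rather than f):

```python
# Back-of-the-envelope check of the de Broglie "internal clock" frequency
# from E = mc^2 = h f (CODATA constants, SI units).
h    = 6.62607015e-34    # Planck constant [J s]
hbar = h / (2 * 3.141592653589793)
c    = 2.99792458e8      # speed of light [m/s]
m_e  = 9.1093837015e-31  # electron mass [kg]

E = m_e * c**2           # rest energy, ~511 keV ~ 8.19e-14 J
f = E / h                # ~1.2e20 Hz    (E = h f)
omega = E / hbar         # ~7.8e20 rad/s (E = hbar omega)
print(f, omega)
```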

An electron can be created together with a positron from just 2 x 511 keV of EM field energy - after which it (the field?) should self-organize into these ≈10^21 Hz intrinsic oscillations.

So can we call the electron an example of a time crystal?
What other examples of a lowest-energy state self-organizing into a periodic process are there?

ps. Besides self-organization of the lowest-energy state into periodic motion (which I don't see they achieved? in contrast to the electron), they alternatively want "period doubling": a system oscillating with period T self-organizes into a 2T-period process - breaking the discrete time symmetry (invariance under a shift by T).
So the popular Couder walkers, recreating many quantum phenomena in classical systems (slides with links), also show period doubling (can they be classified as time crystals?) - here is such a plot from this paper: the horizontal axis is time, the lower periodic process is the liquid surface - externally enforced by a shaker; the upper periodic process shows the droplet trajectory - self-organizing into twice the enforced period:


But generally it seems very valuable to find analogies between spatial and temporal phenomena, like crystals here.
A great tool for that is the Ising model: a Boltzmann ensemble among spatial sequences, which mathematically is very similar to the Feynman path (temporal) ensemble of QM - using this mathematical similarity, for the Ising model we get the Born rule, Bell violation, or analogues of quantum computers.
What other phenomena can be translated between spatial and temporal dimensions?

9
Physics, Astronomy & Cosmology / Do Feynman path integrals satisfy Bell locality assumption?
« on: 16/02/2020 08:16:48 »
There are generally two basic ways to solve physics models:

1) Asymmetric, e.g. the Euler-Lagrange equation in CM, the Schrödinger equation in QM.
2) Symmetric, e.g. the least action principle in CM, Feynman path integrals in QM, Feynman diagrams in QFT.

Having a solution found with 1) or 2), we can transform it to the other perspective, but solutions originally found using 1) or 2) generally seem to have somewhat different properties - e.g. regarding "hidden variables" in the Bell theorem.
For example, transforming a solution found in the symmetric way 2) to the asymmetric perspective 1), its state was originally chosen also according to all future measurements - as in superdeterminism.

The asymmetric ones 1), like the Schrödinger equation, usually satisfy the assumptions used to derive the Bell inequality, which is violated by physics - which is seen as ruling out local realistic "hidden variables" models. Does this also concern the symmetric ones 2)?


We successfully use classical field theories like electromagnetism or general relativity, which assume the existence of an objective state of their field - why is the Bell theorem not a problem here: how does this field differ from local realistic "hidden variables"?

Wanting to resolve this issue, there are e.g. attempts to undermine the locality assumption by proposing faster-than-light communication, but these classical field theories don't allow for that.

So I would like to ask about another way to dissatisfy Bell's locality assumption: there is a general belief that physics is CPT-symmetric, so maybe it solves its equations in the symmetric way 2), e.g. through Feynman path integrals?

Good intuitions for solving in the symmetric way are provided by the Ising model, where asking about the probability distribution inside such a Boltzmann sequence ensemble, we mathematically get Pr(u) = (psi_u)^2, where one amplitude comes from the left and the second from the right; such a Born rule allows for a Bell-violation construction. Instead of a single "hidden variable", due to symmetry we have two: one from each direction.

From the perspective of e.g. general relativity, we usually solve through Einstein's equations, which are symmetric - spacetime is a kind of static "4D jello" there, satisfying this local condition on intrinsic curvature. It seems tough (?) to solve it in an asymmetric way like through Euler-Lagrange, which would require "unrolling" spacetime.

Assuming physics solves its equations in the symmetric way, e.g. QM with Feynman path integrals instead of the Schrödinger equation, do Bell's assumptions hold - are local realistic "hidden variables" still disproven?

---
Update: Born-like formulas from symmetry in the Ising model (Boltzmann sequence ensemble): Pr(i) = (ψ_i)^2, where one amplitude ("hidden variable") comes from the left and the second from the right:


10
Physiology & Medicine / Provisional vaccine for fast-spreading new viruses?
« on: 31/01/2020 07:47:30 »
Developing a standard vaccine for the coronavirus will take at least a few months - which might be too late.
However, its sequence is already known and is nearly identical across samples - suggesting a recent single point of origin for the human host.

So the question is if/how production of some provisional vaccine could be started quickly - not perfect, but fast to introduce - also exploiting the fact that these viruses are currently nearly identical.
For example, synthesizing its outer proteins and putting them on liposomes - would introducing them into the blood have a chance of preparing the immune system for the real virus?

11
Physics, Astronomy & Cosmology / Violation of Bell-like inequalities with Ising model?
« on: 15/01/2020 09:33:34 »
Quantum mechanics is equivalent to the Feynman path ensemble, which after Wick rotation becomes the Boltzmann path ensemble, which can be normalized into a stochastic process as the maximal entropy random walk (MERW).

But the Boltzmann path ensemble also has a spatial realization: the 1D Ising model and its generalizations - a Boltzmann distribution among spatial sequences of spins or more complicated objects.

For E(u,v) being the energy of interaction between u and v neighboring spins (or something more general), define the transition matrix M_uv = exp(-E(u,v)) and find its dominant eigenvalue/eigenvector: M psi = lambda psi for maximal lambda. Now it is easy to find (e.g. derived here) that the probability distributions of one and two neighboring values inside such a sequence are:

Pr(u) = (psi_u)^2
Pr(u,v) = psi_u (M_uv / lambda) psi_v

The former resembles the QM Born rule, the latter TSVF - the two ending psi come from the propagators from both infinities, as M^k ~ lambda^k psi psi^T for a unique dominant eigenvalue thanks to the Frobenius-Perron theorem. We nicely see this Born rule coming from symmetry here: spatial in Ising, temporal in MERW.
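These formulas are easy to check numerically; a minimal sketch (my own verification, using a generic random symmetric energy matrix):

```python
import numpy as np

# Numerical check of the Boltzmann-sequence formulas above, for a generic
# symmetric interaction energy E(u,v) on a 4-state "spin".
rng = np.random.default_rng(0)
E = rng.normal(size=(4, 4))
E = (E + E.T) / 2                    # symmetric interaction energy
M = np.exp(-E)                       # transition matrix M_uv = exp(-E(u,v))

w, V = np.linalg.eigh(M)
lam, psi = w[-1], np.abs(V[:, -1])   # dominant eigenpair, psi unit-normalized

Pr_u = psi**2                                   # one-point distribution (Born-like)
Pr_uv = psi[:, None] * M / lam * psi[None, :]   # two-point distribution

print(Pr_u.sum(), Pr_uv.sum())
print(np.allclose(Pr_uv.sum(axis=1), Pr_u))
```

Both distributions are normalized, and summing the pair distribution over v recovers Pr(u) = (psi_u)^2, since sum_v M_uv psi_v = lambda psi_u.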

Having Ising-like models as a spatial realization of Boltzmann path integrals, getting the Born rule from symmetry, maybe we could construct a Bell-violation example with them?

Here is a MERW construction (page 9 here) for violation of Mermin's Pr(A=B) + Pr(A=C) + Pr(B=C) >= 1 inequality for 3 binary variables A, B, C - intuitively "tossing 3 coins, at least 2 are equal" (e.g. here is the QM violation):



From the Ising perspective, we need a 1D lattice of 3-spin nodes with constraints - allowing neighbors only according to the blue edges in the diagram above, or some other constraint, e.g. just forbidding |000> and |111>.

Measurement of the A and B spins is a defect in this lattice, as above - fixing only the measured values. Assuming a uniform probability distribution among all possible sequences, the red boxes have correspondingly 1/10, 4/10, 4/10, 1/10 probabilities - leading to the Pr(A=B) + Pr(A=C) + Pr(B=C) = 0.6 violation.

Could this kind of spin-lattice construction be realized?

What types of constraints/interactions in spin lattices can be realized?

While Ising-like models provide a spatial realization of Boltzmann path integrals, is there a spatial realization of Feynman path integrals?

12
Physics, Astronomy & Cosmology / Are all processes time/CPT-reversible, measurement, stimulated emission, BB,...?
« on: 28/08/2019 09:09:56 »
While the CPT theorem suggests that all processes have time/CPT-symmetric analogues, there are popular doubts regarding some of them - starting with measurement:

1)  An example of wavefunction collapse is atom deexcitation, which releases energy - it is reversible, but reversing it requires providing energy, e.g. in the form of a photon, to excite the atom back. Can measurement be seen this way - that there is always some accompanying process, like an energy release, which would also need to be reversed? For example in the Stern-Gerlach experiment: the spin tilts to parallel or anti-parallel alignment to avoid precession in the strong magnetic field - does this have some accompanying process like an energy release, e.g. as a photon? Can it be observed?

2)  Another somewhat problematic example is stimulated emission, used in lasers - causing a photon emission which finally e.g. excites a target later along the light path. Does it have a [url=https://physics.stackexchange.com/questions/308106/causality-in-cpt-symmetry-analogue-of-free-electron-laser-stimulated-absorbtion]time/CPT-symmetric analogue[/url]: some stimulated absorption - causing a photon absorption which e.g. deexcites a target earlier along the light path?

3)  Quantum algorithms usually start with state preparation: all 0/1 qubits are initially fixed to, let's say, |0>. Could there be a time/CPT analogue of state preparation: fixing the values, but at the end (as <0|)?

4)  One of the cosmological examples is the Big Bang: the hypothesis of a point-like start of time seems in disagreement with the CPT theorem - instead it suggests some symmetric twin of the Big Bang before it, like in cyclic models of the universe. Is the hypothesis of a point-like start of time in agreement with the CPT theorem? Could these two possibilities be distinguished experimentally?

What other processes are seen as problematic from time/CPT symmetry perspective?
Which can be defended, and which essentially require some fundamental asymmetry?

13
Physics, Astronomy & Cosmology / How physics can violate Pr(A=B) + Pr(A=C) + Pr(B=C) >= 1 ?
« on: 26/08/2018 06:39:09 »
While the original Bell inequality might leave some hope for violation, here is one which seems completely impossible to violate - for three binary variables A, B, C:

Pr(A=B) + Pr(A=C) + Pr(B=C) >= 1

It has an obvious intuitive proof: tossing three coins, at least two of them need to give the same value.
Alternatively, choosing any probability distribution pABC among the 2^3 = 8 possibilities, we have:
Pr(A=B) = p000 + p001 + p110 + p111, and analogously for the other two pairs, hence
Pr(A=B) + Pr(A=C) + Pr(B=C) = 1 + 2 p000 + 2 p111 >= 1
... however, it is violated in QM, see e.g. page 9 here: http://www.theory.caltech.edu/people/preskill/ph229/notes/chap4.pdf

If we want to understand why our physics violates Bell inequalities, the above one seems the best to work on, being the simplest and having an absolutely obvious proof.
QM uses the Born rule for this violation:
1) Intuitively: the probability of a union of disjoint events is the sum of their probabilities: pAB? = pAB0 + pAB1, leading to the above inequality.
2) Born rule: the probability of a union of disjoint events is proportional to the square of the sum of their amplitudes: pAB? ~ (psiAB0 + psiAB1)^2
Such a Born rule allows this inequality to be violated down to 3/5 < 1 by using psi000 = psi111 = 0, psi001 = psi010 = psi011 = psi100 = psi101 = psi110 > 0.
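A short script (my own check, not from the linked lecture notes) confirming that these amplitudes violate the inequality down to 3/5 under the Born rule, with the unmeasured variable summed at the amplitude level:

```python
from itertools import product

# Amplitudes from the construction above: psi_000 = psi_111 = 0,
# all six remaining amplitudes equal.
psi = {s: 0.0 if s in ("000", "111") else 1.0
       for s in ("".join(p) for p in product("01", repeat=3))}

def pr_equal(i, j):
    """Born-rule probability that variables i and j are equal, where the
    unmeasured third variable is summed on the amplitude level:
    p ~ (sum of amplitudes of the disjoint completions)^2."""
    weights = {}
    for s, a in psi.items():
        key = (s[i], s[j])                 # measured pair of outcomes
        weights[key] = weights.get(key, 0.0) + a
    sq = {k: w**2 for k, w in weights.items()}
    total = sum(sq.values())
    return sum(v for (x, y), v in sq.items() if x == y) / total

total = pr_equal(0, 1) + pr_equal(0, 2) + pr_equal(1, 2)
print(total)  # ~0.6 < 1, violating the "classical" bound
```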

We get such a Born rule when considering an ensemble of trajectories: proper statistical physics shouldn't see particles as just points, but rather as their trajectories, considering e.g. a Boltzmann ensemble - as in Feynman's Euclidean path integrals or their thermodynamical analogue, MERW (Maximal Entropy Random Walk: https://en.wikipedia.org/wiki/Maximal_entropy_random_walk ).

For example, looking at the [0,1] infinite potential well, the standard random walk predicts a uniform probability density rho = 1, while QM and the uniform ensemble of trajectories predict a different rho ~ sin^2 with localization, and the square as in the Born rule has a clear interpretation:


Is the ensemble of trajectories the proper way to understand the violation of this obvious inequality?
Compared with local realism from the Bell theorem, the path ensemble has realism and is non-local in the standard "evolving 3D" way of thinking ... however, it is local in the 4D view: spacetime, Einstein's block universe - where particles are their trajectories.
What other models with realism allow this inequality to be violated?

14
Physics, Astronomy & Cosmology / Are particles perfect points? Experimental boundaries for size of electron?
« on: 18/07/2018 08:43:32 »
There is some confidence that fundamental particles are perfect points, e.g. to simplify QFT calculations - what experimental evidence do we have, especially for the electron?

The electron Wikipedia article points only to an argument based on the g-factor being close to 2: Dehmelt's 1988 paper extrapolating (by fitting a parabola to two points!) from proton and triton behavior that the RMS (root mean square) radius of particles composed of 3 fermions should be ≈ g−2:



Another argument for the point nature of the electron might be its tiny cross-section, so let's look at it for electron-positron collisions:



As we are interested in the size of a resting electron (no Lorentz contraction), we should extrapolate the flat line sigma ~ 1/E^2 down to a resting electron, getting sigma ~ 100 mb, corresponding to a ~2 fm radius.

From the other side, we know that two EM photons carrying 2 x 511 keV of energy can create an electron-positron pair, hence energy conservation doesn't allow the electric field of the electron to exceed 511 keV of energy - which requires some deformation of the E ~ 1/r^2 field at the femtometer scale: the Coulomb field energy outside radius r, integral of (eps0/2)|E|^2 = e^2/(8 pi eps0 r), already reaches 511 keV at r of the order of a femtometer.
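A quick sanity check of this femtometer scale (my own arithmetic, assuming the standard expression e^2/(8 pi eps0 r) for the Coulomb field energy outside radius r):

```python
# Solving e^2/(8 pi eps0 r) = m c^2 for r gives the radius below which the
# bare 1/r^2 Coulomb field alone would already carry more than 511 keV.
e    = 1.602176634e-19   # elementary charge [C]
eps0 = 8.8541878128e-12  # vacuum permittivity [F/m]
c    = 2.99792458e8      # speed of light [m/s]
m_e  = 9.1093837015e-31  # electron mass [kg]
pi   = 3.141592653589793

r = e**2 / (8 * pi * eps0 * m_e * c**2)
print(r)  # ~1.4e-15 m, i.e. half the classical electron radius
```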

Can we bound the size of the electron from above with the g-factor or scattering experiments?
Is there other experimental evidence?

15
Technology / Huffman coding being replaced with ANS - ongoing revolution in data compression?
« on: 22/01/2017 07:12:46 »
Besides the imaginary data-compression revolution in a certain HBO TV series, it turns out a lot has also actually changed in the real world in the last few years - much better compressors you can just download and use.

One reason is more efficient entropy coding.
Huffman coding (e.g. A -> 010) is fast but approximates probabilities with powers of 1/2, while e.g. a symbol of probability 0.99 carries only ~0.014 bits of information. This inaccuracy was repaired by arithmetic coding, but at the cost of an order of magnitude slower processing.
This compromise has recently been ended by ANS coding, which is both fast (cheap) and accurate:
wiki: https://en.wikipedia.org/wiki/Asymmetric_Numeral_Systems
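To make the 0.99 example concrete, a tiny back-of-the-envelope calculation (mine, not from the linked benchmarks):

```python
import math

# The cost of Huffman's power-of-1/2 approximation for a binary source
# with p = 0.99: Huffman must still spend a whole bit per symbol, while
# the information content (what arithmetic coding / ANS approach) is tiny.
p = 0.99
info_bits = -math.log2(p)            # ~0.0145 bits carried by the likely symbol
huffman_bits = 1.0                   # shortest possible Huffman code length
entropy = -(p * math.log2(p) + (1 - p) * math.log2(1 - p))  # ~0.08 bits/symbol
print(info_bits, entropy)
```

Huffman cannot get below 1 bit/symbol for a binary source, while the actual entropy here is ~0.08 bits/symbol - roughly a 12x overhead that arithmetic coding and ANS avoid.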

Arithmetic coding in 2013 decoded at ~50 MB/s/core ... now the analogous task is done by ANS at ~1500 MB/s on the same processor - a nearly 30x software speedup for the bottleneck of data compressors.
benchmarks: https://sites.google.com/site/powturbo/entropy-coder

ANS has been used in data compressors since 2014, e.g. in the currently default Apple LZFSE or the great and free Facebook Zstd - which is 2-5x faster than gzip and provides much better compression:
https://github.com/facebook/zstd/
7-zip with zstd: https://mcmilk.de/projects/7-Zip-zstd/
Total Commander plugin: http://totalcmd.net/plugring/zstdwcx.html


16
Physics, Astronomy & Cosmology / Do nonlocal entities fulfill assumptions of Bell theorem?
« on: 01/11/2015 19:24:51 »
While the dynamics of (classical) field theories is defined by (local) PDEs like the wave equation (finite propagation speed), some fields allow for stable localized configurations: solitons.
For example the simplest: the sine-Gordon model, which can be realized by pendula on a rod connected by springs. While gravity prefers the pendula pointing "down", increasing the angle by 2 pi also means "down" - if these two different stable configurations (minima of the potential) meet each other, a soliton (called a kink) is required between them, corresponding to a 2 pi rotation, like here:


Kinks are narrow, but there are also solitons filling the entire universe, like a 2D vector field with a (|v|^2-1)^2 potential - a hedgehog configuration is a soliton: all vectors point outward - such solitons are highly nonlocal entities.
A similar example of nonlocal entities in a "local" field theory are Couder's walking droplets: a corpuscle coupled with a (nonlocal) wave - exhibiting quantum-like effects: interference, tunneling, orbit quantization (thread http://www.thenakedscientists.com/forum/index.php?topic=46639.0 ).
The field depends on the entire history and affects the behavior of the soliton or droplet.
For example, the Noether theorem says that the entire field guards (among others) angular momentum conservation - in an EPR experiment, momentum conservation is in a sense encoded in the entire field - in a very nonlocal way.

So can we see real particles this way?
The only counter-argument I have heard is the Bell theorem (?)
But while solitons happen in local field theories (information propagates with finite speed), these models of particles - solitons/droplets - are extremely nonlocal entities.

In contrast, the Bell theorem assumes local entities - so does it apply to solitons?

17
Chemistry / Molecular shape descriptors for virtual screening of ligands?
« on: 26/10/2015 08:35:22 »
I have been thinking about designing molecular descriptors for virtual screening: such that two molecules have a similar shape if and only if their descriptors are similar.
They could be used separately, or to complement e.g. some pharmacophore descriptors.

They should be optimized for ligands - which are usually elongated and flat.
Hence I thought to use the following approach:
- normalize rotation (using principal component analysis),
- describe bending - usually one coefficient is sufficient,
- describe the evolution of the cross-section, for example as an evolving ellipse.

Finally, the shape below is described by 8 real coefficients: length (1), bending (1), and 6 for the evolution of the ellipse in cross-section. It expresses the bending and that this molecule is approximately circular on the left and flat on the right:
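A minimal sketch of the first step (rotation normalization via PCA) on a toy point cloud - my own illustration with made-up coordinates, not the linked Mathematica implementation:

```python
import numpy as np

# Rotation normalization by PCA: align the molecule's principal axes with
# the coordinate axes, so the longest extent lies along x and the flattest
# direction along z (ligands being elongated and flat).
# `coords` is a hypothetical (N, 3) array of atom positions.
rng = np.random.default_rng(1)
coords = rng.normal(size=(50, 3)) * np.array([5.0, 2.0, 0.5])  # toy "ligand"

centered = coords - coords.mean(axis=0)
# Principal axes = eigenvectors of the covariance matrix, sorted by variance.
cov = centered.T @ centered / len(centered)
w, V = np.linalg.eigh(cov)   # ascending eigenvalues
V = V[:, ::-1]               # longest axis first
aligned = centered @ V       # x = long axis, z = flattest axis

# After normalization the variances are sorted: elongated in x, flat in z.
print(aligned.var(axis=0))
```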




preprint: http://arxiv.org/pdf/1509.09211
slides: https://dl.dropboxusercontent.com/u/12405967/shape_sem.pdf
Mathematica implementation: https://dl.dropboxusercontent.com/u/12405967/shape.nb

Have you met something like this? Is it a reasonable approach?
I am comparing it with USR (ultrafast shape recognition) and (rotationally invariant) spherical harmonics - have you seen other approaches of this type?

18
Physics, Astronomy & Cosmology / The paradox of Hawking radiation - is matter infinitely compressible?
« on: 21/09/2013 14:31:06 »
Hypothetical Hawking radiation means that a set of baryons can finally be transformed - "evaporate" - into massless radiation: that baryons can be destroyed. It requires that this matter was initially compressed into a black hole.
If baryons can be destroyed in such extreme conditions, the natural question is: what is the minimal density/heat/pressure required for such baryon number violation? (or, for hypothetical baryogenesis, creating more baryons than anti-baryons)
While a neutron star collapses into a black hole, the event horizon grows continuously from a point in the center, like in this picture from: http://mathpages.com/rr/s7-02/7-02.htm

As the radius of the event horizon is proportional to the mass inside, the initial density of matter had to be infinite. So if baryons can be destroyed, it should happen before the formation of the event horizon starts - releasing huge amounts of energy (the complete mc^2) - pushing the core of the collapsing star outward - preventing the collapse. And finally these enormous amounts of energy would leave the star, which could result in the currently not understood gamma-ray bursts.

So isn't it true that if Hawking radiation is possible, then baryons can be destroyed, and so black holes shouldn't form?

We usually consider black holes just through the abstract stress-energy tensor, not asking what microscopically happens there - behind these enormous densities ... in a neutron star, nuclei join into one huge nucleus; in a hypothetical quark star, nucleons join into one huge nucleon ... so what happens when it collapses further? Do quarks join into one huge quark? And what then, going further toward the infinite density in the central singularity of the black hole, where light cones are directed toward the center?

The mainly considered baryon number violation is proton decay, which is required by many particle models.
It has not been found experimentally - in huge room-temperature pools of water - but hypothetical baryogenesis and Hawking radiation suggest that maybe we should rather search for it under more extreme conditions.
While charge/spin conservation can be seen as guarded by the surrounding EM field (at any distance), e.g. through the Gauss theorem, what mechanism guards baryon number conservation? If it is just a potential barrier, baryons should be destroyed at a high enough temperature ...
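The Gauss-theorem argument for charge, for comparison, is just:

```latex
% Gauss's law: the total charge inside any closed surface S can be read
% off from the field flux through S alone, at any distance:
\[
Q_{\mathrm{enc}} = \varepsilon_0 \oint_S \mathbf{E}\cdot d\mathbf{A},
\]
% so the surrounding field itself "guards" charge conservation;
% no analogous long-range field is known for baryon number.
```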

Is matter infinitely compressible? What happens to matter while it is compressed into a black hole?
Is baryon number ultimately conserved? If yes, why does the Universe have more baryons than anti-baryons? If not, where should we search for, or expect, such violation?
If proton decay is possible, maybe we could induce it by some resonance, like shining the proper gammas onto the proper nuclei? (getting the ultimate energy source: complete mass->energy conversion)
Is proton decay considered in neutron star models - should it be? Would it allow them to collapse to a black hole? Could it explain the not yet understood gamma-ray bursts?

19
Geek Speak / How to design a place for massive joint work on creating standards/legislations?
« on: 06/02/2013 08:47:45 »
There are situations when a huge number of people have to jointly work on extremely important documents. For example, HTML standards, where a few large players like web browser producers discuss among each other and with potentially millions of web designers around the world ... or thousands of politicians/lobbyists working on legislative compromises at the scale of e.g. the USA/EU/world ... or maybe all of us, to finally transition toward a more direct and transparent democracy ...

How are such seemingly impossible tasks conducted? Do they use some kind of e.g. TortoiseSVN? Are they transparent enough for the interested parties?
I don't even want to imagine how it is done in politics, but for standards there are, for example, mailing lists - so to find the information you are interested in or would like to comment on, you have to dig through a huge list of multi-threaded comments ...

Let us think about designing, and maybe creating, an open-source tool for such serious discussions among a potentially huge number of people - which could then be applied to various purposes requiring optimized and transparent work on important documents.
What would the perfect situation look like?
I imagine that from the page of the document we/they work on, I can click on a paragraph/sentence to get to a page describing the related issues: summarized discussions that led to its current form, links to discussions I could participate in, ongoing votes between alternatives ...
So, for example, looking at a piece of legislation, everyone could trace each sentence (e.g. to a lobbyist) and understand its evolution to the current form. Thanks to this better understanding, interpretations could be closer to the intended result, and generally people could better identify with, e.g., the law.

So how should such a place be designed?
Here is a brief description of how I would imagine it.
First of all, there should be no anonymity there - for really serious discussions, it would be best if every action were digitally signed, with this information, and generally the whole history, available to all users. Statements there would then have a legal status similar to a signed article published by a journalist - think a few times before writing something there. All information about the mechanisms used by this tool should be easily accessible (and also discussed and eventually modified). Digital signatures are usually legally equivalent to real ones, so this place could, by the way, also be used e.g. for direct democracy.
Secondly, statements should be relatively compact, focusing on a single issue. Each should have one main link to what it refers to (and possibly additional links) - the discussion is then basically a tree (with occasional, less important transverse links), like on reddit but a bit more complicated.

Thirdly, a well-thought-out marking system is required - much more complicated than reddit's. To prevent pathologies, each mark should be signed and well justified ... and marks can themselves be judged, and so on.
Many different categories of marks would be needed - not just a plus/minus, but also specifying, and justifying, what the mark is for. Their direct purpose is to allow freely customizing the order in which the list/subtree of related topics is displayed - from standard chronological ordering, through ordering by some category of marks, up to different mixed custom criteria. Another purpose is using these marks in discussions, or e.g. to nominate persons with high marks to take care of the pages of a given issue (their actions still being fully traceable and evaluated).
Some limit on points is required - for example 1/category/day, accumulating up to 10/category. They can be spent (with justification) on pluses/minuses in selected categories (e.g. +1 patriotism, -1 realism). The weight of a point depends e.g. on the total marks of its author in this category. The "/category" is meant to motivate looking at others' statements - and so at one's own - from the perspectives of different values.
Marks of marks influence their weight, and generally the weights of the author's marks - some kind of page-rank would be required to calculate the final weights.
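The page-rank idea for final weights could be sketched as follows - a mark's weight depends on its author's reputation, and reputation is in turn the weighted sum of marks the author has received, iterated to a fixed point. The data model, the damping factor, and the function name are my assumptions for illustration, not part of the proposal:

```python
def mark_weights(marks, users, damping=0.85, iters=50):
    """marks: list of (author, target, value) with value in {-1, +1}.
    Returns a reputation score per user (mean 1.0 across users)."""
    rep = {u: 1.0 for u in users}
    for _ in range(iters):
        # each mark contributes its value scaled by its author's reputation
        incoming = {u: 0.0 for u in users}
        for author, target, value in marks:
            incoming[target] += value * rep[author]
        # damping keeps everyone a baseline voice, like PageRank's teleport;
        # negative totals are floored at zero
        new_rep = {u: (1 - damping) + damping * max(incoming[u], 0.0)
                   for u in users}
        # renormalize so reputations stay comparable between iterations
        norm = sum(new_rep.values())
        rep = {u: v * len(users) / norm for u, v in new_rep.items()}
    return rep
```

With this scheme, a plus from a highly reputed user lifts the target more than a plus from a low-reputation one, which is the "marks of marks influence their weight" behavior described above.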
Example of list of categories of marks (to discuss):
- Morality / empathy (as external evaluation of situation)
- Altruism / hard work (as one's own work/sacrifice; minus for selfishness or lazily distributed points)
- Justice/objectiveness (e.g. unjustified marks, lack of objectivity)
- Realism (awareness of the broader situation)
- Patriotism (good for the nation)
- Originality / innovativeness (minus for obviousness, plus for an interesting idea)
- Compactness (plus for a good essence/form ratio, minus for comments leading nowhere)
- ... ?
Some may have subcategories - like realism in politics, economics, physics ...
More controversial examples:
- Coherence / consistency / transparency - minus for lies or frequent changes of opinion (to be distinguished from legitimate evolution), plus for mature defense of an idea, internal consistency, honesty in difficult situations,
- Openness / flexibility - minus for not adapting to changing realities, ignoring strong arguments, blind fanaticism ... plus for openness to different views and evolution of one's own thinking.

Besides statements, there would be:
- profiles of persons/institutions/organizations/companies (with a part edited by the subject and a part everyone can discuss),
- voting pages - secret (e.g. for a final vote) or open (e.g. while choosing between alternatives),
- pages for working on a given petition, bill, or referendum request - with links to pages focusing on single sentences, and a planned deadline to stop working and start gathering signatures,
- wiki-like pages on different subjects and specific topics for discussion, briefly introducing the problem and the results of discussions - with statistics and lots of links.

Another important issue is changeability. I think people should be able to change their judgments/marks. The main link of a statement should be unchangeable, but additional links could be added/updated. Someone might have commented on (a part of) the text, so it should only be possible to add successive updates, not rewrite history.

How would you imagine constructing a tool to improve working on important documents?
A tool for serious discussions among a potentially huge number of sometimes extremely invested people?
One that raises the level of discussion by its very construction?
One of many applications could be a National Discussion Forum improving the work on legislation - by making it more transparent and easier for people to express their perspective on the laws that concern them.
So finally - what do you think of giving citizens the possibility to really take part in (transparent) work/discussion on new legislation?

20
Plant Sciences, Zoology & Evolution / Chiral life concept - creating mirror image synthetic life?
« on: 30/01/2013 23:18:16 »
Imagine the mirror image of a living cell - built from scratch using the mirror versions of its molecules (enantiomers). The symmetry of physics suggests it should work like a usual cell, but using e.g. L-sugars instead of our D-sugars: http://wikibin.org/articles/chiral-life-concept.html
The first synthetic cells have already been created, so maybe it is only a matter of time until someone starts building chiral ones - first bags of proteins, then simple prokaryotes, but maybe also the complex structures of eukaryotes a few decades later ... or even a human?
Some possible applications:
- doubling the number of possible enzymes we could design and use,
- such organisms would be incompatible with standard pathogens - we could design bottom-up sterile ecosystems for extreme conditions like Mars,
- maybe forever-healthy chiral humans, immune to our pathogens - which are the direct or indirect cause of most illnesses.

What do you think of such a possibility - is it realizable? How much time would it take? What other applications can you think of? What would be the implications for our society? ...
