Transition nets and Markov chains

SUMMARY

This discussion focuses on the intersection of automata theory, quantum mechanics (QM), and computational models, particularly in the context of information theory. The participants explore the necessity of physical mediums for information transfer, emphasizing Shannon's information principle, which states that communication requires a physical medium and spatial extent in time. They delve into the implications of binary computing on manifolds, the role of Brownian motion in computational processes, and the differences between classical and quantum circuits. The conversation highlights the significance of polynomial time processing and the impact of thermodynamic interference on computation.

PREREQUISITES
  • Understanding of automata theory and its applications
  • Familiarity with quantum mechanics and quantum information (QI)
  • Knowledge of Shannon's information theory and its principles
  • Basic concepts of computational complexity, particularly polynomial time (P-time)
NEXT STEPS
  • Research quantum computing models and their implications for information processing
  • Study the principles of Shannon's information theory in depth
  • Explore the role of Brownian motion in computational systems
  • Investigate the differences between classical and quantum circuits in computation
USEFUL FOR

This discussion is beneficial for researchers and practitioners in quantum computing, theoretical computer science, and information theory, particularly those interested in the foundational principles of computation and information transfer.

deferro
Has anyone here done any study with automata?

I'm interested in deriving a general model, in a state + transition encoded way; that's a parsing-grammar approach to start with, then generalizing it further to a sort of wavevector/wavefunction algebra, i.e. a QM model, since I study QI at the moment.
Information systems need QM representations that are not crammed with complicated "machine instructions"; we need better abstractions of 0 and 1 in 2 and 3 dimensions. Hall liquids and the AB effect essentially encode these "new" algebras. :)
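A "state + transition encoded" model of the kind asked about can be sketched as a Markov chain: a set of states plus a row-stochastic transition matrix. A minimal sketch in pure Python, with an illustrative two-state chain (the transition probabilities are made up for the example, not taken from the thread):

```python
# Two states, standing in for 0 and 1. P[i][j] is the probability
# of moving from state i to state j in one step; each row sums to 1.
P = [[0.9, 0.1],
     [0.2, 0.8]]

def step(dist, P):
    """One transition: new_dist[j] = sum_i dist[i] * P[i][j]."""
    return [sum(dist[i] * P[i][j] for i in range(len(P)))
            for j in range(len(P[0]))]

# Start entirely in state 0 and evolve the distribution.
dist = [1.0, 0.0]
for _ in range(100):
    dist = step(dist, P)

# After many steps the distribution approaches the stationary
# vector pi satisfying pi = pi P; here pi = [2/3, 1/3].
print(dist)
```

The state vector plays a role loosely analogous to the wavefunction amplitudes the poster alludes to, except that probabilities (not complex amplitudes) are propagated.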
 
Hm, OK. The fundamental principle of information is this: there isn't any if time and space are not real, for any signal s(t). That is, there is no physical way to communicate in zero time, or over any distance, unless energy is transferred (as information, spatially encoded in time for a signal s).

So temporal and spatial displacements extend (as communications) to surfaces where they are transported, copied/erased, and polarized. There would be no possibility of computation without communication; Shannon's "information principle" is that information requires a physical medium or circuit, and spatial extent in time.

Essentially a binary computer is a set of 2-dimensional gates in, or on, a manifold M: linear transformations of 'information' (bits of stored charge) in registers. Registers that are extensions of 2n bits, or w bits wide, therefore represent a time-independent algebra A for the manifold M|(s,t), and the 'velocities' are those of logic, as a path for a computation through n registers, where the first is the 'input' and the last, or nth, is the 'output'.
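The picture above, a computation as a path through w-bit registers from input to output, can be sketched as a pipeline. The three stage functions below are hypothetical, chosen only to illustrate "logic flowing through registers"; they are not from the thread:

```python
W = 8                     # register width in bits
MASK = (1 << W) - 1       # keep every word within w bits

# Each stage transforms the w-bit word as it moves to the next
# register; these transformations are illustrative placeholders.
stages = [
    lambda x: (x + 3) & MASK,      # register 1 -> register 2
    lambda x: (x << 1) & MASK,     # register 2 -> register 3
    lambda x: x ^ 0b10101010,      # register 3 -> register 4 (output)
]

def run(x):
    registers = [x]                # first register holds the input
    for f in stages:
        registers.append(f(registers[-1]))
    return registers               # last register holds the output

print(run(5))                      # the full path: [5, 8, 16, 186]
```

The point of the sketch is only that the "path" is the ordered register contents; the same path could be realized by physically different machines at different speeds, as the paragraph above argues.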

Computational flow is independent of the physical realization of the logic that 'runs' any process, since two different physical computers can determine the same result in different amounts of time. Here 'time' is background-independent: we can build machines that 'compute' physically in indeterminate time, with Brownian motion as the physical 'force' acting on the flow, in logical terms.
We might not want to wait centuries or millions of years, but a Brownian machine will give as accurate a result as we like; it's a question of design.

We like answers in shorter amounts of time, i.e. polynomial time. P-time rules the computational space: there is no way to determine in advance that a 'program' will take a given path and halt with the "correct answer"; however, it always has one, even if it's not what was expected.

A digitized machine 'uses' charge (in electrons) as the deformed potential. Spin and magnetic interactions are overwhelmed by the geometry: there are no inductive currents, except stray ones. If information is written in another, external form, then magnetic effects can be used to transform the computational and physical basis.

There is a difference: in electronic machines the two bases are always assumed to be congruent, but this isn't the case in quantum circuits, since you only get a result when you perturb the system. Preparing a circuit is equivalent to running an algorithm; recovery means adjusting the phase of a state so that a preserved phase difference evolves as a pure state.
This is what happens in polarization: sunglasses absorb half the "glare", i.e. the incident light polarized the 'wrong' way, and deliver the remainder in parallel mode to your vision. You are in the quantum world when you wear them, since removing the filters means you see classical glare (yep, still there). The filters process photons either singly or as a group; it doesn't matter, because you see the result 'now', not when the glasses actually do anything.
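The "absorbs half the glare" figure follows from Malus's law: an ideal polarizer transmits intensity I = I0 cos²θ for light polarized at angle θ to its axis, and for unpolarized glare θ is uniformly random, so the average transmission is the mean of cos²θ, which is exactly 1/2. A quick Monte Carlo check in Python:

```python
import math
import random

# Malus's law: transmitted fraction is cos(theta)**2 for a photon
# polarized at angle theta to the filter axis. For unpolarized
# glare, theta is uniform; the average transmission should be 1/2.
random.seed(0)  # deterministic run for the example
samples = [math.cos(random.uniform(0.0, math.pi)) ** 2
           for _ in range(100_000)]

print(sum(samples) / len(samples))  # close to 0.5
```

This matches the analytic result, since the average of cos²θ over a full range of angles is 1/2.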

Flow is linear because of the register path, but thermodynamic interference might mean a halt occurs even though the process is evolving toward a result. In an electronic/thermodynamic system the interference is unimportant for temperatures within a certain physical limit; so classical machines are restricted to polynomial-time processing by temperature and the physical basis.

The sum + product logic of digital machines means that flow occurs because Boole products remove or erase something, while summation doesn't. In that sense summation (OR, XOR) is a diffusion and production (AND) is a dispersion. The first is expansive; the second is linear and closed/open for some path through a register map B(x), so that M(x')|n> -> M(x)|i> + B|i>, or "output = input x + map B over x(i)". The index i counts up to n-1 and the result is in M(x) after n steps of P-time.

An input x(n) is transformed into a result by M(x), which is a computation C(x)|i> where products make potential (logic) vanish or get absorbed by the path taken. Since AND loses one bit per 2-d switch, only one input can be recovered, or recalled as the one that 'threw' the switch; the other is lost irrecoverably. AND is therefore not affine, but it is required for logical flow in any M that computes information by communicating it.
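The claim that AND erases a bit while XOR does not can be checked directly by enumerating preimages: three distinct input pairs collapse onto AND's output 0, so the lost input cannot be recovered, whereas XOR's output plus either input always determines the other. A minimal sketch:

```python
from itertools import product

# Collect, for each AND output, every input pair that produces it.
and_preimages = {}
for a, b in product((0, 1), repeat=2):
    and_preimages.setdefault(a & b, []).append((a, b))

print(and_preimages[0])  # three pairs collapse onto output 0
print(and_preimages[1])  # only (1, 1) yields output 1

# XOR loses nothing: knowing the output and one input recovers
# the other input exactly, for every input pair.
for a, b in product((0, 1), repeat=2):
    assert (a ^ b) ^ b == a
```

This is the standard logical-irreversibility observation (Landauer's point about erasure): AND is many-to-one, XOR with a retained input is one-to-one.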
 
