Recent papers
[1] vixra:2601.0116 [pdf]
Greggle & Gruggle: Composable Regular Path Queries and Graph Manipulation
Greggle is a small query language and tool for performing regular path queries over labelled directed graphs. Gruggle is a companion Node.js utility for ingesting, merging, inspecting, and lightly manipulating graphs in the Graphviz dot format. Graphviz is a widely used system for graph visualization; its dot language is simple to author and makes it easy to view results with standard Graphviz tools. Together the two utilities provide a frictionless workflow: Gruggle builds, merges, filters, and styles graphs; Greggle answers expressive path queries with edge-level predicates; and Gruggle can consume Greggle's annotations (e.g., find-path) to visualize witnesses. This document presents both tools, explains why they are complementary, and shows how they can be used jointly in analysis and visualization tasks.
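As an illustration of the kind of query Greggle answers, the sketch below evaluates a regular path query by breadth-first search over the product of a labelled graph and a small automaton, returning one witness path. This is a generic Python sketch of the underlying idea, not Greggle's syntax or implementation; the graph, the pattern a b* c, and all names are illustrative assumptions.

```python
# Illustrative sketch (not Greggle's actual syntax): evaluating a regular
# path query by BFS over the product of a labelled graph and a tiny
# hand-built automaton for the pattern a (b)* c.
from collections import deque

# Labelled directed graph: node -> list of (edge_label, successor).
graph = {
    "u": [("a", "v")],
    "v": [("b", "v"), ("c", "w")],
    "w": [],
}

# Automaton for the regular expression a b* c: state -> {label: next_state}.
nfa = {0: {"a": 1}, 1: {"b": 1, "c": 2}, 2: {}}
ACCEPT = 2

def find_path(graph, nfa, start, accept):
    """Return one witness path matching the pattern, or None."""
    queue = deque([(start, 0, [start])])
    seen = {(start, 0)}
    while queue:
        node, state, path = queue.popleft()
        if state == accept:
            return path
        for label, succ in graph.get(node, []):
            nxt = nfa[state].get(label)
            if nxt is not None and (succ, nxt) not in seen:
                seen.add((succ, nxt))
                queue.append((succ, nxt, path + [succ]))
    return None

print(find_path(graph, nfa, "u", ACCEPT))  # ['u', 'v', 'w']
```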
[2] vixra:2601.0115 [pdf]
Physical Characteristics of Geodesics For A Yilmaz Point Mass Metric
Since Schwarzschild's first solution of the Einstein field equations, the simple model of a single, point-mass gravitating source has encompassed an impressive array of phenomena that have provided confirmation for Einstein's theory of General Relativity. These include gravitational time dilation and spectral redshifts, gravitational refraction of light, perihelion precession of planetary orbits, innermost stable orbits of accretion disks and, recently, the shadows of the photon spheres of extremely compact masses. These phenomena are associated with the geodesic motions of material particles or photons in the immediate vicinity of large masses that can be regarded as point sources of gravity. The limited purposes of this article are to present the underlying physics of the exponential metric of Yilmaz and to demonstrate that it correctly encompasses the observed phenomena. As an isotropic metric, it may be the only one also in accord with the observed isotropy of inertia.
[3] vixra:2601.0111 [pdf]
The Soul of Waves: Physical Interpretation of Dispersion Relations
This pedagogical paper presents a comprehensive framework for interpreting dispersion relations across fundamental physical systems. We adopt a novel approach that starts from the mathematical form $\omega(\mathbf{k})$ and systematically extracts its physical content, rather than deriving it from first principles. Through an in-depth case study of the massive Klein-Gordon dispersion relation $\omega^2 = \omega_0^2 + c^2 k^2$, we demonstrate how this single equation encodes phase velocity, group velocity, density of states, effective mass, and impedance. The analysis reveals the universal nature of this dispersion form, which manifests in quantum fields, plasmas, superconductors, and photonic crystals with different physical interpretations of its parameters. We complement this with detailed examination of classical systems including mass-spring chains and hydrodynamic waves, providing tangible analogies that bridge conceptual understanding between quantum and classical wave phenomena. The paper includes eleven carefully designed figures that visualize key concepts and a comprehensive catalog of dispersion relations in the Appendix. Aimed at advanced undergraduates and instructors, this work emphasizes conceptual understanding through physical interpretation, offering a unified pedagogical framework for teaching wave propagation across physics curricula while maintaining mathematical rigor and depth.
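As a concrete instance of the quantities the abstract says this relation encodes, the phase and group velocities follow directly from $\omega(k)$; the derivation below is the standard one, not taken from the paper itself.

```latex
% Phase and group velocity for \omega^2 = \omega_0^2 + c^2 k^2.
\[
  \omega(k) = \sqrt{\omega_0^2 + c^2 k^2}, \qquad
  v_p = \frac{\omega}{k}, \qquad
  v_g = \frac{d\omega}{dk} = \frac{c^2 k}{\omega} .
\]
% Multiplying the two gives the reciprocal relation
\[
  v_p \, v_g = c^2 ,
\]
% so for \omega_0 \neq 0 the phase velocity exceeds c while the group
% velocity (the signal velocity) stays below it.
```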
[4] vixra:2601.0110 [pdf]
Synthetic Aperture Radar Point Target Response
The algorithms for computing the point target response in Synthetic Aperture Radar (SAR) will be presented. The target modeling and simulations will be performed following the procedure developed by McDonough et al. (1985) [1] for SEASAT. Simulation results will be provided through the block diagram modeling of the SAR system with Capsim® [2]. The SAR project has been a part of the Capsim® distribution since 1990. The research on SAR was conducted by the author while a Professor at NC State University in 1987. The GitHub repository was created in November 2025.
[5] vixra:2601.0102 [pdf]
Proving the Formal Inconsistency of Special Relativity
The relativistic contraction of distances in the direction of relative motion is used here to formally deduce a potentially infinite number of violations of the Second Law of the Reflection of Light, violations that are impossible according to the first principle of special relativity. From this impossible, and therefore false, contraction of distances, the falsity of time dilations and the falsity of phase differences in synchronizations are formally deduced. Thus, special relativity is an inconsistent theory whose inconsistency must be a consequence of one of its two fundamental principles, the second principle being the only one that can be false, since the first establishes the universality of physical laws, without which the observed consistent evolution of the known universe would be impossible.
[6] vixra:2601.0101 [pdf]
Addition and Multiplication: Spectral Orthogonality and Innovation in the Arithmetic of Integers
The arithmetic of the integers is governed by two fundamental operations, addition and multiplication, whose interaction lies at the core of many deep problems in number theory. While multiplication preserves prime factorization in a rigid and conservative manner, addition typically destroys multiplicative structure and generates new prime content. In this work, we develop a unified structural framework that explains this asymmetry through spectral and operator-theoretic principles. By embedding the integers into a Hilbert space, we show that multiplication acts as a diagonal, layer-preserving operator in the prime spectral basis, whereas addition acts as a non-local, mixing operator driven by carry propagation. This spectral incompatibility leads to an arithmetic uncertainty principle, forbidding simultaneous localization in additive and multiplicative bases. Building on this structure, we introduce additive innovation as a quantitative measure of the new prime information created by a sum. We prove that the only obstruction to innovation arises from smoothness and $S$-unit phenomena in the coprime core. Using classical results on smooth numbers, we show that additive innovation is typically large, yielding unconditional abc-type inequalities in density. Finally, we develop an information-theoretic perspective, showing that addition produces entropy across prime scales while multiplication remains information-preserving. These results provide a structural explanation for the sum-product phenomenon and reframe classical problems as manifestations of the intrinsic incompatibility between additive and multiplicative spectral structures.
[7] vixra:2601.0100 [pdf]
From Algebraic Extension to Physical Law: Multiplication, Integration, and the Emergence of Variational Field Reality
We propose a foundational route from elementary mathematical operations to the structural form of physical law. The guiding thesis is that multiplication is the primitive operation that generates geometric extension (e.g., area via bilinear composition), integration is the continuous accumulation of such local extensions into global quantities, and differentiation (or functional variation) is the dual operation that extracts local constraints from global accumulations. From these principles, we show how any consistent description of "physical reality" must be formulated in terms of local densities defined over a continuous geometric support, whose global content is obtained by integration and whose dynamics follows from variational (action) stationarity. Within this operational framework, quadratic field terms arise naturally as the simplest scalar invariants built from local degrees of freedom, while source couplings appear as bilinear products between generalized currents and the underlying deformation variables. Furthermore, we show that quantum entanglement is not a dynamical anomaly but a structural inevitability: additive accumulation acting on states represented in a multiplicative (spectral) basis generically produces global correlations that resist local factorization. This reframes Bell-type violations as a failure of structural independence rather than a signal of superluminal causal influence, thereby preserving relativistic causality at the level of dynamical propagation. Crucially, beyond the contractive modes commonly associated with forces and curvature, the same logic compels expansive degrees of freedom: an entropic sector characterized by an intensive-extensive product structure (temperature-like $\times$ entropy-like) contributing intrinsically to the global action. This viewpoint yields a general blueprint for interpreting electromagnetic, gravitational, and entropic responses as projected modes of a common underlying field structure, and it clarifies why concrete realizations of such a blueprint (including quantum-elastic and gravito-entropic field models) arise as minimal, structurally stable completions rather than independent hypotheses.
[8] vixra:2601.0099 [pdf]
Quantum-Elastic Geometry: a Unified Framework for Fields and Fundamental Constants of Nature
We present the Quantum-Elastic Geometry (QEG) theory, a unified framework wherein spacetime is modeled as a fundamental, physical substrate with quantum, elastic, and dissipative properties. The state of this medium is described by a single, symmetric rank-2 tensor field, $\mathcal{G}_{\mu\nu}$, whose dynamics are governed by a generally covariant action. Known physical interactions are shown to emerge as distinct, irreducible deformation modes of this unified field: gravity, electromagnetism, and a new field, denominated the "thermo-entropic field", that gives rise to irreversible thermodynamics. Furthermore, fundamental constants of nature are shown to be uniquely determined and interrelated by the substrate's properties. We derive the fundamental constants of nature through two distinct yet convergent approaches: (i) from the physical postulates of QEG, assuming the $\mathcal{G}_{\mu\nu}$ tensor, its properties leading to dimensional collapse ($[M]\equiv[L]\equiv[T]$), and parsimonious physical principles (e.g., reciprocity, damped equipartition, self-consistency), we deduce specific functional forms for the constants; and (ii) independently, assuming only foundational geometric principles for the substrate (homogeneity, isotropy, covariance, Lorentz invariance) and imposing self-consistency, formalized via a minimal set of geometric normalization conditions consistent with the QEG framework, we derive the substrate's emergent structure and properties, obtaining precisely the same functional forms for the constants. The outcome is a robust, convergent two-way deductive framework, in which fundamental constants are geometrically enforced, emerging as predictable consequences of a stable and symmetrically constrained geometry. Finally, we show how the theory predicts, among other results, a scale-dependent gravitational coupling derived from a geometric duality in self-energy, which offers a parameter-free resolution to key cosmological tensions, including the Hubble crisis. In summary, QEG provides a coherent and consistent origin for both fields and constants, unifying them as rigorously derived emergent properties of a single, dynamic spacetime substrate.
[9] vixra:2601.0098 [pdf]
Quantum-Elastic Geometry: a Nonlinear Substrate Extension for Massive Regimes
We present a minimal nonlinear extension of Quantum-Elastic Geometry (QEG), in which a single symmetric deformation tensor $G_{\mu\nu}$ and its modal projections underpin the effective long-range sectors of gravity, electromagnetism, and thermo-entropic dynamics. The extension accounts for two additional empirical structures, finite-range interactions and hadronic-scale confinement, without introducing new fundamental fields beyond $G_{\mu\nu}$. Finite range emerges when selected projected modes acquire geometric masses set by the local curvature of the substrate self-interaction potential, $m_X^2 \equiv V_X''(0)$, yielding Yukawa/Proca-type propagation. In the genuinely nonlinear regime, quartic (and higher) terms in $V(G)$ can energetically favor filamentary minima; under suitable variational constraints, this leads to flux-tube configurations with approximately constant tension and an effective linear energy-separation scaling (confinement-like behavior). Crucially, the framework yields an endogenous classification of particle-like excitations: particles are finite-energy, localized eigenmodes or topologically stabilized defects of the elastic vacuum $G_{\mu\nu}$, carrying quantized action. Under finite-action boundary conditions and a compact order-parameter sector, the Standard Model taxonomy is reorganized as sectors of the physical configuration space: fermions correspond to nontrivial spinorial or holonomy sectors, bosons to topologically trivial transport modes, leptons to elementary globally extendable defects, quarks to fractional defect configurations obstructed from isolated finite-action completion, and hadrons to closed composites in which obstruction classes cancel. The same construction yields a natural interpretation of generations as discrete radial excitation levels ($k = 0,1,2,\ldots$) around a fixed defect topology, e.g., $k=0 \to e$, $k=1 \to \mu$, $k=2 \to \tau$, thereby relating mass hierarchies to the spectral structure of a single underlying defect rather than to distinct fundamental species.
[10] vixra:2601.0097 [pdf]
Assuming $c < R^2$, the Explicit abc Conjecture of Baker is True, and It Implies the abc Conjecture is True
In this paper, assuming that the conjecture $c < R^2$ is true, we give the proof that the explicit abc conjecture of Alan Baker is true, which implies that the abc conjecture is true. We propose a mathematical expression for the constant $K(\epsilon)$. Some numerical examples are provided.
[11] vixra:2601.0096 [pdf]
A Brief Study on Solitaire Modulo 3
In this short article, we will discuss a card game, henceforth called Solitaire modulo 3. After having described how it works, we will arrive, through a probabilistic calculation, at determining the probability of victory. In particular, we will use rook polynomials, which will allow us to obtain a closed form for calculating the probability of winning at Solitaire modulo 3. Finally, we will study the case where the number of cards in play is much greater than the number of constraints present in the game format. Under this assumption, the Solitaire modulo 3 mechanism becomes asymptotically equivalent to a binomial distribution.
[12] vixra:2601.0095 [pdf]
Modeling Social Apathy: Stochastic Dynamics of Opinion Under Contradictory Stimulation
This paper develops a stochastic dynamical model to investigate the psychological impact of exposure to contradictory information, a prevalent feature of modern media ecosystems. We formalize "contradictory stimulation" as stochastic noise in a model of emotional state dynamics. Our analysis reveals two key regimes: high-intensity contradiction drives individuals towards stable apathy, while specific parameter combinations produce bimodal polarization, where psychological states oscillate randomly between euphoria and lethargy. These results provide a mathematical basis for sociological phenomena like anomie and offer a novel mechanism for emergent polarization from a uniform information stream. The study establishes a theoretical framework for generating testable hypotheses about the effects of information chaos on political engagement and psychological well-being.
[13] vixra:2601.0093 [pdf]
Does Gravity Care About Electric Charge? A Minimalist Model and Experimental Test
Does gravity care about electric charge? Precision tests of the weak equivalence principle achieve remarkable sensitivity but deliberately minimize electric charge on test masses, leaving this fundamental question experimentally open. We present a minimalist framework coupling electromagnetism to linearized gravity through conservation of a complex charge-mass current, predicting charge-dependent violations $\Delta a/g = \kappa(q/m)$. Remarkably, this prediction occupies unexplored experimental territory precisely because precision gravity tests avoid charge variation. We identify this as a significant gap and propose a modified torsion balance experiment where $q/m$ is treated as a controlled variable. Such an experiment could test whether gravitational acceleration depends on electric charge, probing physics in genuinely new parameter space. This work exemplifies how theoretical minimalism can reveal overlooked opportunities in fundamental physics.
[14] vixra:2601.0088 [pdf]
Curvature-Regulated Infrared Gravity Without New Degrees of Freedom
We present a conservative infrared extension of General Relativity in which late-time cosmic acceleration emerges from a curvature-regulated modification of gravitational time dilation. The framework introduces no additional propagating degrees of freedom and remains fully covariant at the action level. Exponential suppression ensures agreement with all laboratory, solar-system, and strong-field tests of gravity. We provide a detailed mathematical formulation, analyze background and perturbative dynamics, compare with existing observational constraints, and study the theory across solar, galactic, and cosmological curvature scales. The model reproduces ΛCDM behavior at late times while yielding distinct, testable predictions in ultra-low curvature environments.
[15] vixra:2601.0086 [pdf]
Engineering Polarization: How Contradictory Stimulation Systematically Undermines Political Moderation
Political moderation, a key attractor in democratic systems, proves highly fragile under realistic information conditions. We develop a stochastic model of opinion dynamics to analyze how noise and differential susceptibility reshape the political spectrum. Extending Marvel et al.'s deterministic framework, we incorporate stochastic media influence $\zeta(t)$ and neuropolitically-grounded sensitivity differences ($\sigma_y > \sigma_x$). Analysis reveals the moderate population, stable in deterministic models, undergoes catastrophic collapse under stochastic forcing. This occurs through an effective deradicalization asymmetry ($u_{B}^{\text{eff}} = u + \sigma_y^2/2 > u_{A}^{\text{eff}}$) that drives conservatives to extinction, eliminating cross-cutting interactions that sustain moderates. The system exhibits a phase transition from multi-stable coexistence to liberal dominance, demonstrating how information flow architecture, independent of content, systematically dismantles the political center. Our findings reveal moderation as an emergent property highly vulnerable to stochastic perturbations in complex social systems.
[16] vixra:2601.0084 [pdf]
Clustering Words by Graph-Based AHC Variants
In this research, we propose and apply graph-based AHC variants to word clustering. The initial AHC version, which clusters graphs, was previously proposed as an approach to word clustering. We consider three AHC variants: one where the data clustering proceeds bottom-up with a similarity threshold, one that allows merging more than two clusters at a time, and one where clusters are merged based on their radius. We modify these three AHC variants into graph-based versions, as well as the initial AHC version. The goal of this research is to improve clustering performance through these modifications.
[17] vixra:2601.0082 [pdf]
Hydrodynamic Resolution of the Hubble Tension and Prediction of Chromatic Vacuum Dispersion
The persistent discrepancy between local measurements of the Hubble constant (H_0 ≈ 73 km/s/Mpc) and values derived from the Cosmic Microwave Background (H_0 ≈ 67 km/s/Mpc) suggests a fundamental incompleteness in the ΛCDM model. We propose a solution based on dissipative wave mechanics within a viscous continuum. By introducing a non-vanishing kinematic viscosity ν to the vacuum substrate, we demonstrate that cosmological redshift is a non-linear function of distance, induced by Taylor-Couette-like dissipation rather than metric expansion. Numerical fitting against 2026 data from Cosmic Chronometers and JWST-JADES reveals that a single viscous parameter resolves the tension. Furthermore, we derive a falsifiable prediction: a Chromatic Vacuum Dispersion (CVD), implying that redshift is frequency-dependent ($dz/d\omega > 0$). This effect is testable with current lensed supernova observations. Keywords: Hubble Tension, Vacuum Viscosity, Dissipative Cosmology, Dark Energy Alternative, Chromatic Dispersion, Hydrodynamic Spacetime
[18] vixra:2601.0081 [pdf]
Hierarchical and Tiny Recursive Models for Medical Image Captioning
Recent advancements in Hierarchical Reasoning Models (HRM) have demonstrated strong capabilities in complex algorithmic and abstract reasoning tasks by mimicking multi-timescale cognitive processes. In this work, we extend this architecture to medical image captioning, introducing specific ImageHRM variants. Furthermore, we explore a radical simplification of this paradigm: the Tiny Recursive Model (TRM). Challenging the necessity of complex dual-loop biological hierarchies, TRM employs a single "tiny" network (7M parameters) that recurses deeply to achieve superior generalization. We introduce ImageTRM, which adapts this "Less is More" philosophy to vision-language tasks. Our experiments on ROCOv2 show that while the Triple-Loop FuseLIP ImageHRM achieves state-of-the-art results, the tiny ImageTRM with a Swin backbone surprisingly outperforms it, demonstrating that deep recursive reasoning with high-quality visual features can surpass larger, more complex architectures.
[19] vixra:2601.0080 [pdf]
Infinitely Algebraic Classes
We show that on a complex projective manifold $X$, for $\mathbb{G}=\mathbb{R}$ or $\mathbb{Q}$, a class in $H^{p,p}(X;\mathbb{Z})\otimes\mathbb{G}$ is represented by a convergent infinite series of integration currents over algebraic cycles with real coefficients. It implies that a Hodge class is represented by an algebraic cycle with rational coefficients.
[20] vixra:2601.0077 [pdf]
Joint Prediction of Watch Ratio and Skip Behavior in Recommendation System
This study examines user engagement with online video content using a multi-task learning approach. We combine viewing histories, basic user attributes, and content datasets from several public sources to predict both the proportion of a video watched and whether a user skips a video. The two tasks are learned jointly, using a shared representation with separate outputs for regression and classification. Several common multi-task architectures are evaluated and compared under the same experimental setup, including Multi-Gate Mixture-of-Experts (MMoE), Progressive Layered Extraction (PLE), and the cross-stitch network. Results on a held-out test set show that watch ratio can be predicted with reasonable accuracy, while skip prediction remains challenging and only marginally better than random guessing. Differences between model architectures are small, suggesting that data size and label definition might have a stronger influence on performance than model choice. These findings highlight the difficulty of modeling discrete engagement outcomes from noisy behavioral data and point to the importance of careful label construction in future work. In particular, this study highlights the challenge of skip prediction, likely because the threshold defining a skip is set subjectively.
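As a minimal illustration of the joint setup described above (a shared representation with separate regression and classification outputs), here is a shared-bottom baseline in PyTorch. It is deliberately simpler than the MMoE, PLE, and cross-stitch architectures the study evaluates; all dimensions and names are illustrative assumptions.

```python
# Minimal sketch (assumed architecture, not the paper's exact models): a
# shared-bottom multi-task network with a regression head for watch ratio
# and a classification head for skip probability.
import torch
import torch.nn as nn

class SharedBottomMTL(nn.Module):
    def __init__(self, n_features: int, hidden: int = 64):
        super().__init__()
        self.shared = nn.Sequential(
            nn.Linear(n_features, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
        )
        self.watch_head = nn.Linear(hidden, 1)  # regression: watch ratio
        self.skip_head = nn.Linear(hidden, 1)   # binary: skip logit

    def forward(self, x):
        h = self.shared(x)
        return self.watch_head(h).squeeze(-1), self.skip_head(h).squeeze(-1)

model = SharedBottomMTL(n_features=32)
x = torch.randn(8, 32)
watch_pred, skip_logit = model(x)
# Joint loss: MSE for the watch ratio plus BCE for the skip label.
loss = nn.functional.mse_loss(watch_pred, torch.rand(8)) \
     + nn.functional.binary_cross_entropy_with_logits(
           skip_logit, torch.randint(0, 2, (8,)).float())
loss.backward()
```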
[21] vixra:2601.0076 [pdf]
From Data Pipelines to AI Outcomes: Quantifying the Impact of Data Engineering Decisions on Machine Learning Reliability
The reliability and performance of machine learning (ML) systems in production depend critically on data engineering decisions made throughout the pipeline lifecycle. This comprehensive technical review synthesizes findings from 434 peer-reviewed publications spanning 2018-2026 to quantify how upstream data collection, mid-stream preprocessing and feature engineering, and downstream versioning and monitoring decisions impact ML outcomes. We examine production systems across cybersecurity, healthcare, finance, and cloud-native platforms, analyzing technical frameworks including Apache Kafka, Kubeflow, MLflow, and emerging feature stores. Our analysis reveals that data quality issues account for 60-80% of ML system failures in production, with data engineering decisions influencing model accuracy by up to 40 percentage points. We identify critical decision points across the pipeline, quantify their impacts through empirical evidence, and provide actionable frameworks for practitioners. Key findings include: (1) streaming architectures reduce latency by 10-100× while maintaining accuracy within 2-5% of batch systems; (2) automated data validation catches 70-90% of quality issues before model training; (3) feature stores reduce feature engineering time by 50-70% while improving consistency; and (4) comprehensive lineage tracking enables 3-5× faster debugging of production failures. This review establishes data-centric AI as essential for reliable ML systems and identifies critical gaps in cost-benefit analysis, cross-domain generalization, and standardized impact metrics.
[22] vixra:2601.0072 [pdf]
On the Role of Measurement Events in Non-Equilibrium State Formation
This paper examines the hypothesis that measurement events function as generative operations rather than passive observational processes in the formation of observable states. Preliminary theoretical analysis suggests measurement interactions may constitute the fundamental mechanism by which potential states transition to actualized configurations across quantum and relativistic regimes. Initial exploration indicates similar generative dynamics may operate in information processing systems, thermodynamic state transitions, chemical reaction pathways, neural signal propagation, developmental gene expression, evolutionary selection events, market transaction execution, material phase boundaries, computational proof verification, and distributed consensus protocols. The commonality appears to lie in the discrete, event-based character of state actualization rather than continuous revelation of pre-existing conditions. The present hypothesis is intended as a unifying statement regarding the ontological role of discrete interaction events in state realization, independent of domain-specific implementations. This work presents the foundational hypothesis without detailed mathematical formalism. The author proposes that action, understood operationally through measurement interaction, serves as a cross-domain generative principle. Specific mechanistic treatments and quantitative predictions will be addressed in subsequent publications.
[23] vixra:2601.0065 [pdf]
Importance Sampling and Contrastive Learning Schemes for Parameter Estimation in Non-Normalized Models
Likelihood-approximation methods and contrastive learning (CL) are two prominent approaches for inference in models with an unknown partition function. In this work, we provide a detailed comparison between the likelihood approximation by Geyer's approach (GA) and CL. Rather than increasing the complexity of Geyer's method to enable comparison, as proposed in [1], we adopt the opposite strategy by simplifying CL. We introduce a class of IS-within-CL schemes that estimate the partition function via importance sampling (IS) and reduce the optimization problem to the original parameter space. This perspective motivates the development of novel variants, whose theoretical properties are analyzed and empirically compared in a replicable experimental study. The described IS-within-CL schemes yield a full approximation of the partition function, thereby enabling efficient Bayesian inference. An optimal independent proposal density for IS-within-CL methods and the GA is also introduced. Overall, this work contributes to a clearer unification of likelihood-approximation and CL approaches, offering both theoretical understanding and practical tools for inference in energy-based and non-normalized models. Related MATLAB and R codes are made freely available to help the reproducibility of the results.
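The IS step named above (estimating the partition function by importance sampling) can be illustrated in a few lines. This is a toy one-dimensional sketch with an assumed Gaussian energy and proposal, not the paper's IS-within-CL schemes or its MATLAB/R code.

```python
# Toy sketch of importance sampling for a partition function:
# Z = \int exp(-E(x; theta)) dx, estimated by Z_hat = mean exp(-E(x_i))/q(x_i)
# with x_i ~ q. Energy and proposal below are illustrative assumptions.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

def energy(x, theta=1.0):
    return 0.5 * theta * x**2               # unnormalized Gaussian model

proposal = stats.norm(loc=0.0, scale=2.0)   # wide Gaussian proposal q
xs = proposal.rvs(size=100_000, random_state=rng)
weights = np.exp(-energy(xs)) / proposal.pdf(xs)
z_hat = weights.mean()

print(z_hat, np.sqrt(2 * np.pi))            # both ≈ 2.5066 for theta = 1
```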
[24] vixra:2601.0056 [pdf]
Circuit Theoretic Proof of the Non-Existence of Odd Perfect Numbers
We present a novel circuit-theoretic approach to the long-standing problem of the existence of odd perfect numbers. By mapping the divisor structure of an integer $n$ onto a resistive network $\Gamma(n)$, we construct a unique reduced Kirchhoff Laplacian $\bar{\Gamma} \in \mathbb{Q}^{(n-1) \times (n-1)}$. We define the network topology such that backbone edges possess unit conductance and shortcut edges, determined by the divisors of $n$, possess conductances $G_{a,b} = \gcd(a,b,n)^{-2}$. Utilizing the properties of M-matrices and the uniqueness of the discrete Dirichlet problem, we establish a fundamental identity relating the potential matrix $\bar{\Gamma}^{-1}$ to the arithmetic function $\sigma_{-1}(n)$. We prove that the integrality of the determinant $|\bar{\Gamma}|$ is a necessary condition for $n$ to satisfy the perfect number criterion $\sigma_1(n)=2n$. Through a $p$-adic analysis of the Schur complement and the application of Chiò pivotal condensation, we demonstrate that $|\bar{\Gamma}| \in \mathbb{Z}$ if and only if $n$ is prime or $n \in \{4, 6, 9\}$. By invoking the Euler form of an odd perfect number $n = p^{4\lambda+1}Q^2$ and applying harmonic consistency constraints to the boundary potentials, we derive a parity contradiction in the determinantal cofactors. Specifically, we show that the existence of an odd perfect number requires $|\bar{\Gamma}|$ to be an even integer, which contradicts the classification of integers for which the Laplacian determinant is integral. We conclude that no odd perfect numbers exist.
[25] vixra:2601.0053 [pdf]
Table-Based KNN Variants for Categorizing Words
In this research, we propose table-based KNN variants as an approach to word categorization. The initial KNN version, which receives a table as its input data, was previously proposed as a tool for this task. We consider three KNN variants: one where the selected nearest neighbors are discriminated by their similarities to a new example, one where the attributes are discriminated by their correlations with the target outputs, and one where the training examples are discriminated by their credits. We modify these three KNN variants as well as the initial version of the KNN algorithm. The goal of this research is to improve classification performance through these modifications.
[26] vixra:2601.0052 [pdf]
The Geometric Music of Primes: A Toroidal Framework from Polygon Inscription to a Spectral Formulation of the Riemann Hypothesis
Prime numbers have traditionally been studied through the austere lens of arithmetic, yet their deepest structure may be geometric in nature. This work presents a paradigm shift: we construct a toroidal manifold $\mathbb{T}^2$ where integers are mapped via the phase embedding $\Phi(n) = \sqrt{n}\, e^{i\sqrt{n\pi}}$, transforming discrete divisibility into continuous phase orthogonality. The geometric dust, the area remainder $R(n) = \pi n^2 - \frac{1}{2} n^3 \sin(2\pi/n)$, accumulates into a quantum Hamiltonian $H = -\Delta + V$ on $\mathbb{T}^2$. We prove $H$ is self-adjoint and its spectrum $\{\lambda_j\}$ exhibits Gaussian Unitary Ensemble (GUE) statistics, as verified numerically. Crucially, we propose a geometric formulation of the Riemann Hypothesis: we show that, under the assumption of RH, the eigenvalues of $H$ are real, bounded below by $\frac{1}{4}$, and satisfy the spectral correspondence $\lambda_j^{\text{(calibrated)}} = \frac{1}{4} + t_j^2$, where $\frac{1}{2} + it_j$ are the non-trivial zeros of $\zeta(s)$. Numerical verification shows agreement within 0.1% for the first 50 zeros. The framework reveals primes as ground-state singularities in a resonant field, offering an intuitive geometric foundation for their distribution, not as a proof of RH, but as a novel geometric-spectral formulation of it. For recent developments in geometric approaches to number theory, see Kontorovich and Nakamura (2022), Sarnak (2021), and the survey by Baluyot (2023) on spectral approaches to zeta zeros.
[27] vixra:2601.0048 [pdf]
PictoLens: Gaze-Driven Interaction Technique for Layered Data Visualization Exploration
PictoLens is a novel gaze-based interaction technique for exploring layered data visualizations through progressive disclosure. The system uses real-time gaze data to implement a point-and-click interaction model. Through intuitive gestures such as 'Gaze and Fixate' and 'Gaze and Lean In,' users can seamlessly interact with three representations of the data: an AI-generated pictograph, a scatter-plot visualization, and an annotated scatter-plot visualization. This hands-free and voice-free interaction technique addresses key challenges of traditional data exploration, such as long dwell times and the Midas Touch problem. PictoLens uses intuitive metaphors from everyday gestures: the gaze serves as a pointer, moving the visualization lens. Fixating the gaze at a point on the pictograph unlocks a finer data representation, while leaning forward reveals the most granular, detailed visualization layer with annotations. We present PictoLens' design and implementation to demonstrate its potential as an immersive analytics tool and interaction technique.
[28] vixra:2601.0046 [pdf]
Theory of Quadratic Triadic Relations for Prime Numbers
This paper proposes a new structural approach to the study of consecutive prime numbers based on a quadratic relation linking three successive primes. A stability ratio is introduced and shown to converge asymptotically to unity using explicit bounds for the k-th prime number. This convergence induces a constraint on the local variation of prime gaps, leading to an asymptotic smoothness law for their relative fluctuations. The analysis is fully deterministic and avoids heuristic arguments based on average asymptotics. Numerical validations using verified large prime datasets confirm the theoretical predictions and illustrate the progressive regularization of local gap variations as the prime index increases.
[29] vixra:2601.0045 [pdf]
HelpSteer Transformer: Attribute-Conditioned Language Model with Architectural Innovations
Large language models have demonstrated remarkable capabilities across diverse natural language tasks, yet controlling their output characteristics remains challenging. We present HelpSteer Transformer, an attribute-conditioned language model architecture designed for training on the HelpSteer dataset. The model incorporates modern architectural innovations including Rotary Position Embeddings (RoPE), SwiGLU activation functions, and RMSNorm, enabling fine-grained control over five response attributes: helpfulness, correctness, coherence, complexity, and verbosity. The model contains approximately 60 million parameters across eight transformer layers and is designed for efficient scaling while maintaining high-quality text generation. An explicit attribute conditioning mechanism integrates user preferences directly into the generation process, enabling dynamic control of outputs without requiring separate fine-tuning for different attribute combinations. Architectural analysis and preliminary experiments indicate competitive performance relative to larger baseline models, while maintaining lower computational cost. This work highlights the effectiveness of architectural conditioning for controllable and efficient language model design.
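Two of the components named above have standard formulations; a minimal PyTorch sketch follows (textbook versions with illustrative dimensions, not necessarily the paper's exact configuration).

```python
# Minimal sketches of RMSNorm and a SwiGLU feed-forward block, two of the
# architectural components the abstract names. Dimensions are assumptions.
import torch
import torch.nn as nn

class RMSNorm(nn.Module):
    def __init__(self, dim: int, eps: float = 1e-6):
        super().__init__()
        self.eps = eps
        self.weight = nn.Parameter(torch.ones(dim))

    def forward(self, x):
        # Normalize by the root-mean-square of the last dimension.
        rms = x.pow(2).mean(dim=-1, keepdim=True).add(self.eps).rsqrt()
        return self.weight * x * rms

class SwiGLU(nn.Module):
    def __init__(self, dim: int, hidden: int):
        super().__init__()
        self.w_gate = nn.Linear(dim, hidden, bias=False)
        self.w_up = nn.Linear(dim, hidden, bias=False)
        self.w_down = nn.Linear(hidden, dim, bias=False)

    def forward(self, x):
        # SiLU-gated linear unit: silu(xW_g) * (xW_u), projected back down.
        return self.w_down(nn.functional.silu(self.w_gate(x)) * self.w_up(x))

x = torch.randn(2, 16, 512)
y = SwiGLU(512, 1376)(RMSNorm(512)(x))   # output shape: (2, 16, 512)
```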
[30] vixra:2601.0044 [pdf]
Stochastic Geometric Gravity: A Self-Consistent Framework for Gravitational Fluctuations
We present a stochastic geometric framework for gravity, starting from the Gravitational Balance Equations (GBE) \cite{lavenda}, which arise from varying the Einstein-Hilbert action with respect to sectoral scale factors in a doubly-warped spacetime. The extrinsic curvature is promoted to a random field, and a moment hierarchy is derived from the GBE. A geometric projector closure maps second moments to an effective fluctuation curvature, yielding closed mean equations without ad-hoc stress tensors. The fluctuation energy obeys a generalized Bochner formula, linking geometric dissipation to the mean extrinsic curvature and the intrinsic curvature of the leaves. This approach provides a self-consistent probabilistic description of gravitational fluctuations, revealing that classical general relativity is not a fundamental deterministic theory but rather the first-moment truncation of a more complete stochastic geometric description. In particular, the so-called "exact" vacuum solutions of Einstein's equations, such as Schwarzschild, are not exact; they are mean-field approximations that neglect the essential nonlinear term $K_{AB}K^{AB}$ and all higher fluctuations. This neglect becomes manifest in regimes beyond the photon sphere ($G<3M$), where the classical hierarchy of terms breaks down and the mean-field description yields unphysical results.
[31] vixra:2601.0042 [pdf]
Word Categorization with KNN Variants Considering Feature Similarity and Feature Value Similarity
In this research, we propose three KNN variants that consider the feature similarity, as approaches to word categorization. The initial version of the KNN algorithm which does so was previously proposed as a tool for this task. We consider three KNN variants: one which discriminates the selected nearest neighbors by their distances, another which discriminates attributes by their correlations with the target outputs, and a third which discriminates the training examples by their credits. The feature similarity is applied to the three KNN variants as well as the initial version. Classification performance is improved by applying the feature similarity to the KNN variants, yielding the improved KNN versions.
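A minimal sketch of the first variant follows, under the assumption that "discriminating nearest neighbors by their distances" amounts to inverse-distance vote weighting; the toy vectors and labels are illustrative, not the paper's word representation.

```python
# Toy sketch: KNN classification where each selected neighbor's vote is
# weighted by its inverse distance to the query. Features and labels are
# illustrative assumptions, not the paper's word features.
import numpy as np
from collections import defaultdict

def weighted_knn_predict(X_train, y_train, x, k=3, eps=1e-9):
    dists = np.linalg.norm(X_train - x, axis=1)
    nearest = np.argsort(dists)[:k]
    votes = defaultdict(float)
    for i in nearest:
        votes[y_train[i]] += 1.0 / (dists[i] + eps)  # closer => larger vote
    return max(votes, key=votes.get)

X = np.array([[0.0, 0.0], [0.1, 0.2], [1.0, 1.0], [0.9, 1.1]])
y = np.array(["sports", "sports", "politics", "politics"])
print(weighted_knn_predict(X, y, np.array([0.2, 0.1])))  # -> "sports"
```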
[32] vixra:2601.0033 [pdf]
Four Index Einstein Field Equations with Quantum Like Effects from Pure Geometry and CPT Symmetry Addition to Metric Tensor
In this work I present an extension of the Einstein field equations \cite{1} into four-index equations. This extension naturally yields an energy tensor for the vacuum, and thus for the gravitational field. It is constructed in the spirit of the two-index field equations and in truth needs no additional assumptions about the field equations. From this it follows that it is a natural completion of the two-index equations rather than a true extension, since it fully defines the curvature tensor, not only the Ricci part of the curvature as happens in the two-index equations. In the next parts of the work I add CPT symmetry \cite{2} to the equations and argue that this is the only possible extension of the metric tensor \cite{1}; additionally, I add an interpretation in terms of gluing manifolds in a certain way, from which it follows how to avoid closed timelike curves (CTCs) \cite{5}, and these constructions are unaffected by singularities in the solutions. Finally, in the last part of the work I add quantum-like effects from pure geometry, without invoking any quantization of the field. These effects are divided into two parts: one concerns a wave-function-like object and measurement, the other concerns spin as the orientation of the manifold. The wave-function-like object is constructed from a normalized curvature invariant, which plays the role of the "probability" of finding an object in a given volume of spacetime at a given interval of time. I do not present direct solutions to these equations or concrete examples where they differ from General Relativity \cite{1}.
[33] vixra:2601.0026 [pdf]
Separation, Desire and Time of Waiting
In his Parmenides, Plato subjects two kinds of One to dialectical examination: the absolute One, without parts, which is neither in space nor in time, nor does it have being, and the One that is being, and therefore is the whole that has parts. These are two totally different Ones, two mutually transcendent worlds. Each of the two, considered independently of the other, ultimately proves to be aporetic within the narrow horizon of the act, within whose limits the thought of the Platonic dialogue is exhausted. However, by extending the ontological horizon to the sphere of potentiality, both, united, constitute the structure of Intention, which binds an "I" to its other. In Intention, the "I" does not exist without being, thanks to which it has a soul and a consciousness, and being makes no sense without the "I". The purpose of this article is to clarify this relationship between the absolute one, the "I", and the one of being, the whole, whose synthesis is the person, and to show that Intention, a true theory of everything, integrates both physical reality and aspects inherent to consciousness and the constitution of the I within a single explanatory framework. The distance of separation is reflected in the time of waiting in the mirror that is the three-dimensional space of Intention, and thus of the universe, as well as of every whole that is part of it. A mirror whose substance is desire and in which the Other is revealed.
[34] vixra:2601.0025 [pdf]
Algorithmic Resilience Under Resource Constraints: The Novosibirsk School and the Method of Fractional Steps (1955-1975)
This article examines the development of operator splitting methods in Soviet numerical analysis during 1955-1975, with particular focus on N.N. Yanenko's formalization of the Method of Fractional Steps at the Siberian Branch of the USSR Academy of Sciences. While similar techniques were independently developed in the West (Peaceman-Rachford 1955, Douglas-Rachford 1956), the Soviet school pursued a distinct trajectory shaped by acute hardware constraints and deep epistemological commitments to operator theory. Through analysis of technical publications, archival materials, and comparative historiography, this study argues that material scarcity catalyzed a systematic research program emphasizing computational economy, while a pre-existing mathematical culture valorizing theoretical elegance reinforced this trajectory. The case illuminates how geopolitical constraints and intellectual traditions jointly shaped algorithmic innovation, contributing to methods that ironically became foundational for modern massively parallel computing. Significant archival gaps limit definitive claims about industrial applications, highlighting the need for further primary source research.
[35] vixra:2601.0024 [pdf]
Visualising Fermat’s Last Theorem with Proofs Within Infinite Families of Pythagorean Triples
To support intuitive understanding of Fermat's Last Theorem, this paper presents a simple visualisation based on a defined normalised Fermat plot, and shows that rational directions arising from successive Pythagorean triples with a fixed hypotenuse gap become automatically irrational beyond a finite point, explaining why no Fermat-type integer solutions can occur along these directions.
[36] vixra:2601.0023 [pdf]
The Dark Sectors and the Hubble Tension as Entropy Phenomena: A Thermodynamic Extension of General Relativity at Cosmological Scales
Modern cosmology is characterized by two persistent conceptual challenges: the dominance of the so-called dark sector and the unresolved Hubble tension. Within the standard cosmological framework, dark matter and dark energy are introduced as independent components to reconcile General Relativity with observations at galactic and cosmic scales, while the Hubble constant is assumed to be a single global parameter describing the expansion of the universe. Despite their empirical success, these assumptions lack a clear physical interpretation and have led to growing internal inconsistencies. In this qualitative and theoretical essay, we argue that both the dark sector and the Hubble tension originate from a common source: the incomplete incorporation of thermodynamic laws, particularly entropy, into gravitational cosmology. General Relativity is shown to be a fundamentally local theory of gravity, rigorously conserving energy-momentum in local spacetime regions, but insufficient to describe the global evolution of an expanding, non-equilibrium universe. At galactic, cluster, and cosmic scales, the neglect of entropy production, entropy gradients, and horizon thermodynamics manifests observationally as phenomena attributed to dark matter and dark energy. Within the Thermodynamic Relativistic Gravity - Classical General Relativity (TRG-CGR) framework, the dark sector is reinterpreted as an entropy sector: dark matter emerges as an entropy-induced geometric response ("entropy matter"), while dark energy corresponds to entropy-driven cosmic acceleration ("entropy energy"). We further argue that cosmic geometry is fixed, while the spacetime metric evolves through matter-energy redistribution, entropy growth, and the thermodynamic arrow of time. This evolution naturally leads to a scale-dependent expansion rate, providing a conceptual resolution of the Hubble tension and related anomalies in cosmic age, structure growth, and gravitational lensing amplitude. This work does not propose new exotic substances or violate the well-tested local predictions of General Relativity. Instead, it offers a thermodynamically complete reinterpretation of cosmological phenomena, positioning the dark sector and the Hubble tension as emergent consequences of entropy in an evolving universe. The essay aims to clarify the conceptual foundations of cosmology and motivate future quantitative developments of thermodynamic gravity at large scales.
[37] vixra:2601.0021 [pdf]
Lightweight Cryptographic Instruction Set Extension on Xtensa Processor
We describe a lightweight RISC-V ISA extension for the AES and SM4 block ciphers. Sixteen instructions (and a subkey load) are required to implement an AES round with the extension, instead of 80 without. An SM4 step (quarter-round) takes 6.5 arithmetic instructions, a similar reduction. Perhaps even more importantly, the ISA extension helps to eliminate slow, secret-dependent table lookups and to protect against cache timing side-channel attacks. Having only one S-box, the extension has a minimal hardware size and is well suited for ultra-low power applications. AES and SM4 implementations using the ISA extension also have a much-reduced software footprint. The AES and SM4 instances can share the same datapaths but are independent in the sense that a chip designer can implement SM4 without AES and vice versa. Full AES and SM4 assembler listings, HDL source code for the instruction's combinatorial logic, and C code for emulation are provided to the community under a permissive open source license. The implementation contains depth- and size-optimized joint AES and SM4 S-box logic based on the Boyar-Peralta construction with a shared non-linear middle layer, demonstrating additional avenues for logic optimization. The instruction logic has been experimentally integrated into the single-cycle execution path of the "Pluto" RV32 core and has been tested on an FPGA system.
[38] vixra:2601.0019 [pdf]
Merry-go-Round and Time-Dependent Symplectic Forms
On a merry-go-round, fictitious forces act, such as the centrifugal force and the Coriolis force. Like the Lorentz force, the Coriolis force is velocity-dependent and, following Arnold, can be modeled by twisting the symplectic form. If the merry-go-round is accelerated, an additional fictitious force shows up: the Euler force. In this article we explain how one deals symplectically with the Euler force by considering time-dependent symplectic forms. It turns out that to treat the Euler force one also needs time-dependent primitives of the time-dependent symplectic forms.
[39] vixra:2601.0018 [pdf]
On the Ether Problem in Physics: A Heuristic Viewpoint
Inspired by Mach's philosophical standpoint, Einstein constructed the theory of special relativity, which has been shown to be reliable both theoretically and experimentally. However, the negative conclusion regarding the absolute equivalence of relatively moving inertial frames, as suggested by the twin paradox thought experiment, has not been explicitly reflected at the level of physical theory. The present work attempts to address this issue and includes the following investigations: (1) a reconsideration of the ether problem; (2) derivations of the mass-energy relation and centrifugal acceleration based on an ether contraction framework; (3) a heuristic interpretation of the invariance of the speed of light and inertial forces. It is hoped that this work may offer some conceptual insight to readers interested in this problem.
[40] vixra:2601.0017 [pdf]
Emergent Vacuum Energy from Apparent-Horizon Thermodynamics
Recent high-precision cosmological observations have revealed statistically significant tensions between early-universe inferences and late-time measurements, most notably in the Hubble constant H0 [2] and the clustering amplitude parameter S8 [8]. These discrepancies may indicate limitations of the standard ΛCDM framework when extrapolated across cosmic epochs. In this work, we develop a thermodynamically motivated cosmological model in which the dark energy component is not introduced as a fundamental constant, but instead emerges dynamically from the thermodynamics of the apparent horizon. By applying Hayward's unified first law in conjunction with the Clausius relation to the cosmological apparent horizon, we derive a self-consistent evolution equation for the Hubble parameter H(z). Numerical integration of the resulting evolution law yields a present-day expansion rate H0 ≃ 71.0 km s−1 Mpc−1, which lies between cosmic microwave background inferences and local distance ladder measurements. The model further predicts a present-day matter density Ωm,0 = 0.2677 and a clustering parameter S8 = 0.781, both of which are consistent with recent weak lensing constraints. These results suggest that horizon thermodynamics may provide a viable mechanism for generating an effective dark energy component, and that the observed cosmological tensions could reflect an incomplete thermodynamic description of the cosmic expansion history rather than the need for new fundamental fields.
[41] vixra:2601.0013 [pdf]
The Illusion of Competence: Defining "Epistemic Debt" in the Era of LLM-Assisted Software Engineering
The integration of Large Language Models (LLMs) into the software development lifecycle represents a shift from constructive programming to curated programming. While current metrics focus on productivity gains and syntactical correctness, this paper argues that these metrics are insufficient to capture the long-term systemic risks introduced by AI. We propose the concept of Epistemic Debt: the divergence between the complexity of a software system and the developer's cognitive model of that system. Unlike traditional Technical Debt, which is often a conscious trade-off, Epistemic Debt is an invisible accumulation of "unearned" code that functions correctly but lacks a human owner who understands its causality. This paper provides a theoretical framework for this phenomenon, classifies the specific architectural erosions caused by stochastic code generation, and proposes a "Cognitive Ratchet" methodology to mitigate the collapse of maintainability.
[42] vixra:2601.0008 [pdf]
Exercises and Problems with Solutions: Astronomy, Geodesy, Celestial Mechanics and Least Squares Theory, for Geomatics Students
This is my second book. It includes 70 exercises and problems with solutions in astronomy, geodesy, celestial mechanics and least squares theory for geomatics students.
[43] vixra:2601.0004 [pdf]
Symmetry Breaking and Exclusive Duality: Foundations for a Unified Theory of Structures
This paper investigates the logical-mathematical foundations of physical reality, proposing a model based on the persistence of symmetry breaking from the real to the complex domain. We postulate the existence of two fundamental structures: the Internal Structure S(O), defined in Hilbert space, and the External Structure S(O^-1), defined in the complex field. The theoretical core of the work lies in identifying two mutually exclusive regimes of access to reality: the state of Observation (Potential Infinity) and the state of Understanding (Actual Infinity). We demonstrate that phenomenal reality and logical reality are not static, but the result of a continuous high-frequency exchange between cardinality increment and complex rotation. Furthermore, we hypothesize that such rotation is governed by a metric compatible with the Riemann Hypothesis, linking the distribution of quantum weights to the nature of prime numbers.
[44] vixra:2601.0003 [pdf]
A Falsifiable Effective Framework for Superfluid Vacuum Dynamics in Cosmology and Galaxies
We propose a theoretical cosmology framework in which the classical spacetime manifold is reinterpreted as an emergent superfluid vacuum, described by a Bose-Einstein condensate governed by a nonlinear logarithmic Schrödinger equation (LogSE). In this two-phase picture, the homogeneous ground state of the condensate (Phase A) gives rise to cosmic acceleration (dark energy) through its negative pressure and exhibits a small bulk viscosity that can reconcile disparate measurements of the Hubble constant. Meanwhile, excited states of the condensate (Phase B) form quantized vortices and density solitons that behave as dark matter halos in galaxies. We derive the effective fluid dynamics of this superfluid vacuum, showing that it naturally yields a cosmic equation of state $w \approx -1$ on large scales and MOND-like phenomenology on galactic scales, without requiring unknown particle species. We demonstrate that quantum pressure from the LogSE resolves the core-cusp problem by stabilizing galactic cores, and that the logarithmic self-interaction allows halo core sizes to be decoupled from the particle mass, avoiding the "Catch-22" that plagues fuzzy dark matter. The framework is confronted with observations: it passes current cosmological tests and galactic rotation curve data, while making distinct, falsifiable predictions. In particular, Lorentz invariance emerges only as a low-energy symmetry of the superfluid vacuum, implying an energy-dependent vacuum refractive index at high energies. We discuss how precision multimessenger timing (e.g., GW170817) and ultra-high-energy gamma-ray observations (e.g., LHAASO detection of GRB 221009A) place stringent constraints on any such Lorentz-violating dispersion. Upcoming astronomical surveys and particle experiments will further test this unified "dark" sector framework.
[45] vixra:2601.0002 [pdf]
Length Expansion: A Prerequisite for Understanding the Principle of the Constancy of the Speed of Light and Various Relativistic Paradoxes
The second principle of relativity, stating that the speed of light is constant regardless of the source's velocity, remains incompletely understood. Moreover, the speed of light is incompatible with length contraction. Beyond this, relativity still contains many thought experiments that are difficult to comprehend. These include Bell's spaceship paradox, the muon paradox, the Supplee submarine paradox, and the Ehrenfest paradox. The commonality among these problems is that logical contradictions arise during the application of length contraction. Since these problems stem from length contraction, approaching them with length expansion logically resolves all issues. This article examines whether length expansion resolves this series of problems.
[46] vixra:2512.0150 [pdf]
The 3D Fractal Superset Which Contains the Mandelbrot Set Without Complex Numbers
In this paper we will see that each vector of the 3D Euclidean vector space can be expressed with operations involving rotations of the unit vector of the x-axis. Thanks to that, we will define a new multiplication between vectors which is analogous to what we have seen in our previous paper viXra:2510.0152 without complex numbers. This operation will allow us to construct a 3D fractal set which contains the Mandelbrot set in the planes OXY and OXZ. We will show some cross sections of other parts of that 3D fractal set.
[47] vixra:2512.0146 [pdf]
Some Experiments on Electron Scattering from Atomic Lattice
This study has considered certain aspects of the dynamics of material particles (electrons, neutrons) during their interaction with nuclei of target atoms, the target being an ordered atomic lattice. We took into account factors affecting the particle trajectory, namely, the inverse-square law (Coulomb’s law), physical collisions of both the elastic and inelastic character, and effect of velocity decrease accompanied by bremsstrahlung. Analysis of the obtained results of mathematical simulation of the material particle scattering from the atomic lattice allowed us to reasonably assert that the particles do not possess wave properties giving rise to interference or diffraction. We have proposed a technique allowing practical demonstration of the absence of wave properties in electrons as well as in other material bodies.
[48]
vixra:2512.0142 [
pdf]
Score-Based Graph Generative Models with Sublinear Spectral Density Estimation
We consider score-based generative models for graphs and propose to enhance them with a sublinear-time spectral density estimation module. Our method computes a compact spectral summary of the graph Laplacian via randomized Chebyshev moments, and uses this summary to condition the latent diffusion process and its noise schedule. This yields a spectrum-aware score-based graph generative model that can adapt its diffusion dynamics to the structural properties of the input graphs, while avoiding expensive eigenvalue decompositions.
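The spectral summary step can be illustrated with a standard randomized kernel-polynomial construction. The Python sketch below is an assumed, generic implementation of that idea, not the paper's code: it estimates normalized Chebyshev moments of a scaled graph Laplacian with Hutchinson probes, using only matrix-vector products.

```python
import numpy as np

def chebyshev_moments(L, num_moments=20, num_probes=8, seed=None):
    """Estimate the normalized Chebyshev moments tr(T_k(Lt))/n of a
    scaled Laplacian Lt via Hutchinson's stochastic trace estimator.
    L is assumed to be a dense symmetric (unnormalized) graph
    Laplacian given as a NumPy array; only matrix-vector products
    are used, so no eigenvalue decomposition is performed."""
    rng = np.random.default_rng(seed)
    n = L.shape[0]
    # Map the spectrum into [-1, 1]; for an unnormalized Laplacian
    # a crude upper bound is lambda_max <= 2 * max_degree.
    lam_max = 2.0 * L.diagonal().max()
    Lt = (2.0 / lam_max) * L - np.eye(n)
    moments = np.zeros(num_moments)
    for _ in range(num_probes):
        v = rng.choice([-1.0, 1.0], size=n)      # Rademacher probe
        t_prev, t_curr = v, Lt @ v               # T_0 v and T_1 v
        moments[0] += v @ t_prev
        moments[1] += v @ t_curr
        for k in range(2, num_moments):
            # Chebyshev three-term recurrence: T_k = 2 Lt T_{k-1} - T_{k-2}
            t_prev, t_curr = t_curr, 2.0 * (Lt @ t_curr) - t_prev
            moments[k] += v @ t_curr
    return moments / (num_probes * n)
```

Each probe costs a handful of matrix-vector products, which is what makes moment-based spectral summaries cheap compared with a full eigendecomposition.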
[49]
vixra:2512.0134 [
pdf]
Fast Method for Solving the Minimal Overlapping Circle Expansion Problems
In this paper, we first introduce the Minimal Overlapping Circle Expansion (MOCE) problem. A solution to such a problem has real-world applications, such as finding the location to best communicate with a number of wireless devices, finding the quickest way for a number of vehicles to get to a rendezvous location, etc. We present several algorithms to compute the solution with different running times and accuracies. The first uses an enclosing square to get an approximate solution; the second only considers pair-wise overlap to approximate; the third uses the results of the first two and a few other methods to speed up the computation. Our results show that (1) the approximate algorithm can be 1000 times faster than the accurate algorithm and get to 99.9% of the correct value, and (2) improvements can cut down the compute time by 50% for the accurate algorithm.
[50]
vixra:2512.0132 [
pdf]
Lifelong Preference Learning with Composable Diffusion Models on Edge Devices
Enabling lifelong learning in robots requires models that can continuously adapt to evolving tasks, environments, and user preferences while operating under strict computational and privacy constraints. We propose a framework for robot lifelong learning with composable diffusion models on edge devices where complex robot behaviors are represented as compositions of lightweight diffusion modules trained incrementally over time. Each module captures a reusable skill, preference, or environmental dynamic, and compositions are formed through learned conditioning and guidance mechanisms without retraining the full system. To support on-device deployment, we introduce parameter-efficient adaptation strategies and selective memory replay that bound compute, memory, and energy usage on edge hardware. The resulting system mitigates catastrophic forgetting, enables rapid skill recombination, and preserves data locality by keeping learning and inference fully on-device.
[51]
vixra:2512.0131 [
pdf]
Looking Back at Quantum Mechanics via the Quantity Momentum
By analyzing the historical treatments of quantity momentum in different models, we revisit certain fundamental problems in quantum mechanics, e.g. how to understand quantum interference and quantum scattering. Logically, by taking over the physical picture of dressing cloud surrounding hadrons and applying it to the electromagnetic field dressed by charged particles, we find a deeper understanding of the microscopic property of electromagnetic field. It implies that the amplitude of momentum could be closely related to the oscillation of electromagnetic field dressed by charged particle. Consequently the definition of canonical momentum turns out to be interaction dependent.
[52]
vixra:2512.0127 [
pdf]
Global Existence and Smoothness of The Navier-Stokes Equation via Spectral Decimation on Icosahedral Manifolds
Since the original formulation of the Navier-Stokes equations in 1822, the inability to prove global regularity has been fundamentally rooted in a physical misconception: the assumption that the fluid continuum is isotropic at the dissipation scale. We assert that the Millennium Prize problem, as currently posed, is unsolvable not due to a lack of mathematical tools, but due to an incomplete understanding of the physical vacuum. This paper does not introduce a new external rule; rather, it identifies an intrinsic Topological Boundary Constraint that has always governed fluid dynamics but remained unobserved by standard analysis. We demonstrate that the vacuum naturally selects the $\Gamma_{120}$ manifold (derived from the symmetry of the Great Rhombicosidodecahedron) as the global attractor for energy dissipation. By observing the inherent 72° torsional alignment of the vorticity field, we show that the non-linear advection term is geometrically depleted at the Kolmogorov scale, naturally precluding singularity formation. Finally, we show that the standard isotropic model violates the Second Law of Thermodynamics via spectral aliasing, a violation that nature corrects through this pre-existing geometric governor. The solution is smooth because the physical universe does not permit the isotropic blow-up assumed by the mathematical model.
[53]
vixra:2512.0126 [
pdf]
A Recursive Discrete-Rotation Framework for Waveform Reconstruction and Computational Geometry
"This paper introduces a novel recursive framework for approximating circular geometry and waveforms using discrete segment rotations. Traditional analytic methods, such as the classical circumference formula $C=2pi r$, rely on continuous functions that abstract away the geometric essence of rotation and introduce computational inefficiencies in discrete digital environments. By re-evaluating the 'Method of Exhaustion,' this work derives an original Discrete Radius Formula ($r = frac{C}{2n sin(Deltatheta/2)}$) that eliminates the inherent path-drift found in standard step-based systems. A recursive update algorithm is developed to reconstruct complex signals with $O(1)$ computational complexity, transforming global trigonometric evaluations into local iterative additions. Numerical validation demonstrates high-precision convergence to continuous limits, achieving an absolute error of approximately $7.97 times 10^{-9}$ at high resolution. The results establish a robust bridge between classical geometry and modern digital implementation, offering significant improvements in speed and accuracy for robotics, AI graphics, and signal processing."
[54]
vixra:2512.0125 [
pdf]
Group Theory Ideas With a TI-84 CE
We give some central ideas of abstract algebra in a motivated manner, starting with the construction of the integers with straightedge and compass, extrapolating axioms for these integers, finding a finite version of integers that obey these same axioms, and comparing this result with a permutation group via a Cayley table constructed using a TI-84 program. Along the way we show how the Lagrange, Euler, and Fermat theorems can be motivated and proven as natural results of the development. We hope that the need for, and the essence of, abstraction in mathematics emerges in the process.
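As an illustration of the Cayley-table construction, here is a Python analogue of the abstract's TI-84 program (hypothetical, not the authors' code):

```python
def cayley_table(n):
    """Cayley table of Z_n, the integers mod n under addition: a
    finite system obeying the axioms extrapolated from the integers."""
    return [[(a + b) % n for b in range(n)] for a in range(n)]

# Z_4: each row and each column is a permutation of {0,...,n-1},
# the Latin-square property shared by every group's Cayley table.
for row in cayley_table(4):
    print(row)
```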
[55]
vixra:2512.0123 [
pdf]
Parity-Sector Signatures in Ultraweak Photon Emission from Driven-Dissipative Majorana Spin Systems
We present a minimal driven-dissipative model in which a long-lived spin-correlated fermionic subspace is represented by Majorana operators and a $Z_2$ parity, and couples to the electromagnetic field through dipolar and spin-orbit-assisted interactions. Parity-sensitive relaxation channels imprint the internal sector onto emitted photons, producing polarization- and helicity-resolved structure beyond generic luminescence. Using a Lindblad master equation with periodic modulation, we perform numerical simulations and compute polarization-resolved emission spectra, Floquet sidebands, and photon correlations $g^{(2)}(\tau)$. The model predicts magnetic-field-dependent polarization asymmetries, drive-locked sidebands, and polarization cross-correlators accessible with polarization-resolved Hanbury Brown-Twiss detection. These signatures provide falsifiable discriminants for assessing whether Majorana-parity dynamics can contribute to reported ultraweak photon emission.
[56]
vixra:2512.0116 [
pdf]
Design and Control of an Arduino-Based Multifunctional Robotic Car Using Smartphone Applications
The present work focuses on a multifunctional Arduino-based smart robotic car capable of a range of functionalities within the category of advanced control, automation, and interactivity. Wireless communication is achieved by Bluetooth, voice control through the MIT App Inventor interface, obstacle detection using ultrasonic and infrared sensors, and manual operation through a remote controller and smartphone application. The vehicle is driven by DC (BO) geared motors, controlled by an L298N motor driver connected to an Arduino UNO microcontroller. In this context, wireless communication is enabled by the use of an HC-05 Bluetooth module that allows both manual and voice-commanded navigation. The developed system with an HC-SR04 ultrasonic sensor combined with IR sensors offers obstacle avoidance capability with reliable environmental awareness. The robotic platform provides line-following and obstacle-avoiding features while remaining IR remote controllable. In this work, we demonstrate a seamless integration of hardware and software, resulting in a versatile platform for educational, research, and hobbyist applications in robotics and IoT.
[57]
vixra:2512.0114 [
pdf]
Non-Colliding Path Authorisation with Epoch-Based Liveness
We present Non-Colliding Path Authorisation (NCPA), a lightweight authorisation protocol in which access rights are encoded as single-use, ordered paths through a system graph. Each authorisation must be exercised sequentially, without replay, and without colliding with other concurrent authorisations. To ensure liveness, paths are allocated within bounded epochs, allowing safe reclamation of exhausted resources. Unlike traditional access control systems that rely on centralised locks or cryptographic capabilities, NCPA enforces safety properties through structural constraints and explicit state transitions. We provide an executable specification of the protocol and validate its security properties using property-based testing. Our results demonstrate that NCPA prevents replay, skipping, impersonation, and collisions, while guaranteeing bounded exhaustion and epoch-based recovery.
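The core safety idea, single-use ordered exercise within a bounded epoch, can be sketched in a few lines. This is an illustrative Python toy under my own simplifications; the paper's executable specification additionally handles concurrent, non-colliding authorisations.

```python
class PathAuthorisation:
    """Toy single-use, ordered path authorisation: steps must be taken
    strictly in order, each exactly once, and the grant dies with its
    epoch. Illustrative only."""

    def __init__(self, path, epoch):
        self.path = list(path)
        self.epoch = epoch
        self.cursor = 0               # index of the next permitted step

    def exercise(self, node, current_epoch):
        if current_epoch != self.epoch:
            raise PermissionError("epoch expired: authorisation reclaimed")
        if self.cursor >= len(self.path):
            raise PermissionError("path exhausted: replay refused")
        if node != self.path[self.cursor]:
            raise PermissionError("out-of-order step: skipping refused")
        self.cursor += 1              # irreversible state transition

auth = PathAuthorisation(["gate", "db", "audit"], epoch=7)
auth.exercise("gate", 7)
# auth.exercise("audit", 7)  -> PermissionError (skips "db")
auth.exercise("db", 7)
auth.exercise("audit", 7)
# auth.exercise("audit", 7)  -> PermissionError (path exhausted)
# auth.exercise("gate", 8)   -> PermissionError (epoch expired)
```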
[58]
vixra:2512.0111 [
pdf]
Generalized Crude Brauer Inequality on Addition Chains
We extend the inequality due to Alfred Brauer on standard addition chains to a sequence of additions leading to a finite number, where at most $d \geq 2$ previous terms can be added to generate each term in the sequence.
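Concretely, the chain property being generalized can be checked mechanically. The Python sketch below encodes my reading of the definition, allowing each term to be a sum of between 2 and $d$ earlier terms with repetitions permitted:

```python
from itertools import combinations_with_replacement

def is_generalized_chain(seq, d=2):
    """Check the generalized chain property: every term after the
    first must be a sum of at least 2 and at most d earlier terms,
    repetitions allowed (d = 2 recovers standard addition chains)."""
    for i, x in enumerate(seq[1:], start=1):
        earlier = seq[:i]
        if not any(sum(combo) == x
                   for k in range(2, d + 1)
                   for combo in combinations_with_replacement(earlier, k)):
            return False
    return True

print(is_generalized_chain([1, 2, 3, 6, 12, 24, 30], d=2))   # True
```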
[59]
vixra:2512.0105 [
pdf]
On Quantization of a Scalar Gravity Field with Feynman's Path Integral Quantization Method
Quantization of a scalar field is a standard textbook example of Feynman's path integral quantization. As my findings on the Relativity Theory show that gravitation must be a scalar field, not a tensor field, it is natural to try this quantization method on Nordstrom's and Newton's scalar gravity. It turns out that Feynman's method has many serious errors. The reader doubting it may check the first error very easily. A literature result in equation (11) claims to give a Green function $G(x,x')$ for the Klein-Gordon operator $\Box + m^2$. If so, $(\Box + m^2)G = \delta(x-x')$, and if $(\Box + m^2)y(x) = h(x)$, then $y(x) = \int dx'\, h(x')\, G(x,x')$. We see that when integrating over $x'$ the delta peak picks up the value of $h(x)$, because $\delta(x-x')$ is nonzero only at the single point $x' = x$. But in (11) there is a delta peak $\delta(x^2)$, where $x^2 = |t-t'|^2 - |\mathbf{x}-\mathbf{x}'|^2$ vanishes not at a single point but on a whole subspace. The other errors in Feynman's method are equally clear and real. As expected, quantizing gravitation by this method in Section 6 of this article produces a result that does not look correct.
[60]
vixra:2512.0103 [
pdf]
Geodesic Completeness in Schwarzschild Spacetime via Discrete Superluminal Transitions in the Proper Frame
Standard General Relativity predicts that massive particles crossing the event horizon of a black hole inevitably terminate at a spacelike singularity (r=0). This paper proposes a modification to the standard kinematic model of fermions to resolve this geodesic incompleteness. We posit that elementary particles undergo Simple Harmonic Motion (SHM) in the temporal dimension of their proper frame. By treating the speed of light c not as an asymptotic limit but as a phase transition boundary, we show that the electron-positron annihilation vertex is topologically equivalent to a superluminal reflection event. When applied to gravitational collapse, this framework implies that the Event Horizon acts as a Causal Phase Boundary. Upon reaching the horizon, the particle undergoes a CPT inversion relative to the background metric, effectively reinterpreting the horizon not as an entrance to an interior, but as a repulsive phase transition surface. Furthermore, by extending this phase-dependent horizon logic to higher velocity bands (v≥2c), we establish a continuous topology where a single particle oscillates through infinite generations of matter and antimatter, eliminating the physical singularity. Mathematically, this framework suggests that the spacetime metric is Finslerian, possessing a velocity-dependent signature that ensures action stability across superluminal transitions.
[61]
vixra:2512.0102 [
pdf]
The Electron-Positron Pair Creation in Magnetic Field
The probability of the emission of electron-positron pairs is calculated from the mass operator formalism introduced by Milton et al. (1981). This formalism is used to estimate the rate of electron-positron pair production by virtual synchrotron photons. The rate is governed by a very small function, and therefore the process is beyond the possibility of experimental observation.
[62]
vixra:2512.0101 [
pdf]
The Study of Functions
This is an introduction to the study of real functions, $f : \mathbb{R} \to \mathbb{R}$. We first discuss motivations and examples, ways of representing functions, and take a detailed look at the basic functions, namely polynomials and $\sin$, $\cos$, $\exp$, $\log$. Then we discuss continuity, with the standard results on the subject, and notably with the Weierstrass approximation theorem. We then discuss derivatives, again with the standard results on the subject, notably the Taylor formula and its applications. Finally, we discuss integrals, with what can be done with Riemann sums, the relation with derivatives, and a look into more advanced functional analysis, and several variables too.
[63]
vixra:2512.0098 [
pdf]
Matter-Only Cosmology: A Unified Origin for Inflation and Dark Energy
The standard cosmological model, ΛCDM, successfully describes cosmic acceleration but posits dark energy as a mysterious, independent component of the universe. This paper demonstrates, instead, that dark energy is not a fundamental entity separate from matter, but rather arises as Gravitational Self-Energy (GSE) inherent to matter itself. This model, called Matter-Only Cosmology (MOC), shows that the observed matter density (Ω_m ~ 0.315) naturally generates a dark energy density more than twice as large (Ω_Λ ~ 0.685), driving late-time cosmic acceleration. This is made possible by the dynamic interplay of two competing GSE-induced terms: a negative self-energy component (ρ_{gs}) and a positive interaction component (ρ_{m-gs}), all within standard gravity and without the need for fine-tuning or new fundamental fields. This unified framework elegantly resolves several of the deepest problems in physics. It not only provides a concrete physical origin for dark energy, but also predicts its entire life cycle, showing that it must have been attractive in the early universe, enhancing structure formation, before transitioning to a repulsive phase that drives cosmic acceleration. In doing so, it naturally explains the Hubble tension, the existence of massive galaxies in the early universe, recent indications of a weakening dark energy component, and resolves the cosmological constant coincidence problem. Moreover, MOC unifies the physics of primordial inflation and late-time acceleration as the same GSE dynamics, each with a natural, built-in end mechanism. Finally, by predicting stable, non-singular black hole interiors, MOC offers a resolution to the black hole information paradox. By expressing dark energy as an explicit function of the matter density ρ_m and the horizon scale R, the MOC framework transforms it from a phenomenological parameter into a predictive and falsifiable physical quantity.
[64]
vixra:2512.0097 [
pdf]
Shell-Structured Quantized Masses from Soliton Spectrum
We present a simple model of two interacting Majorana fermions that, in its bosonized form, exhibits a soliton spectrum in the strong-coupling regime. When the couplings are appropriately tuned, the masses of the composite bosonic excitations follow a quantized pattern of the form $m_{n,N} \approx n(N+1)\, m_Z \alpha / (2\pi)$, where $n$ is a positive integer (principal quantum number), $N$ is a non-negative integer (shell index), $m_Z$ is the Z-boson mass, and $\alpha$ is the fine-structure constant. This spectrum emerges naturally from the multi-soliton and shell-like excitations in the coupled sine-Gordon model and provides a direct realization of the quantized mass formula proposed in [1] for charged fermions. The result suggests that shell-structured quantization may be a universal feature of strongly coupled fermionic systems with periodic potentials.
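For scale (my arithmetic, not a claim from the paper): with $m_Z \approx 91.19$ GeV and $\alpha \approx 1/137.04$, the lowest level of this pattern is

$$ m_{1,0} \approx \frac{m_Z\,\alpha}{2\pi} \approx \frac{91.19\ \text{GeV} \times 0.007297}{6.283} \approx 106\ \text{MeV}, $$

which lies close to the muon mass.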
[65]
vixra:2512.0096 [
pdf]
Some Triconnected Graphs and Their Families Without Hamiltonian Cycles
A translation of Emanuels Grinbergs' manuscript "Daži trīssakarīgi grafi un to saimes bez Hamiltona cikliem" ("Some triconnected graphs and their families without Hamiltonian cycles") from Latvian into English. The manuscript mentions flower snark graphs, which Grinbergs had already introduced in 1972.
[66]
vixra:2512.0091 [
pdf]
Complete Mathematical Framework of the Hopf-Fibered 3-Sphere
This document presents a comprehensive mathematical framework for the Hopf-fibered 3-sphere $S^3$. We systematically derive the full geometric, topological, and analytic structure of $S^3$ equipped with its canonical round metric and Hopf fibration $S^1 \hookrightarrow S^3 \to S^2$. The framework establishes $S^3$'s uniqueness properties, rigidity theorems, and advanced geometric consequences emerging from combinations of its basic structures. All results are presented with complete proofs or references to standard mathematical literature. This article should be viewed as a comprehensive synthesis of canonical structures and standard results associated with the Hopf-fibered round 3-sphere, rather than a source of new classification theorems.
[67]
vixra:2512.0088 [
pdf]
Optimal Control and Performance Analysis of a Solar-PV Microgrid
This paper presents an exhaustive study on the operational effectiveness of a standalone hybrid microgrid, integrating a Diesel Generator (DG) and a Battery Energy Storage System (BESS). The primary objective is to develop and validate a comprehensive control and energy management strategy (EMS) that minimizes the total yearly operational cost while strictly adhering to stringent power quality and stability standards. We define five distinct operational scenarios (Scenarios 1-5) for economic evaluation and four dynamic operational cases (Cases 1-4) for technical validation. The economic results demonstrate that the optimal integration strategy (Scenario 5) achieves a cost reduction of over 41.7% compared to a diesel-only approach (Scenario 4), primarily through peak shaving and increased DG efficiency (Figure 2). Technically, a detailed assessment of transient and steady-state performance confirms system compliance. Key metrics, including Maximum Frequency Deviation (kept within ±0.1 Hz), Maximum Voltage Deviation (below ±0.05 p.u.), and Total Harmonic Distortion (THD) at the PCC (maintained below the 3% limit defined by IEEE 519-2014, as shown in Figure 4), are rigorously analyzed. The paper details the component modeling and control hierarchy, and provides extensive discussion on the impact of BESS sizing on system
[68]
vixra:2512.0087 [
pdf]
N-Ary Anticommutators and Generalized Clifford Algebras in Finsler and Spectral Geometry
It is shown that a careful study of the simplest family of generalized Clifford algebras (GCAs) associated with the $N$-th root of unity in $d$ dimensions leads to the following generalized anti-commutator with $N$ entries: $\{ e_{i_1}, e_{i_2}, e_{i_3}, \ldots, e_{i_N} \} = e_{i_1} e_{i_2} e_{i_3} \cdots e_{i_N} + \text{permutations} = N!\, \eta_{i_1 i_2 \ldots i_N}\, e$, where $e$ is the unit element and all $N!$ terms of the permutations appear with the same positive sign. The components of the rank-$N$ metric are $\eta_{i_1 i_2 \ldots i_N} = 1$ iff $i_1 = i_2 = \cdots = i_N$, and $0$ otherwise. The range of the indices $i_1, i_2, \ldots, i_N$ is $1, 2, \ldots, d$. We proceed to explore the $N$-th norm extensions of the quadratic norm and write down a generalized Finsler-like arc-length based on the rank-$N$ metric $g_{\mu_1 \mu_2 \ldots \mu_N}$. We continue by constructing the different expressions of the Dirac operators associated with the (Generalized) Clifford Spaces corresponding to these GCAs. Dirac operators are essential in the study of Spectral Geometry in Noncommutative Geometry after imposing the correspondence between the geodesic distance and the inverse of the Dirac operator (a fermion propagator). These generalized anti-commutators are special types of an $N$-ary algebraic structure. We finalize by displaying the relation among GCAs and the algebras underlying the noncommutative fuzzy torus, and discuss applications in condensed matter and quantum groups. We conclude with some remarks on $N$-ary algebras and their applications in Mathematics and Physics.
[69]
vixra:2512.0086 [
pdf]
Matrix Representations of Su(3) and Sl(3,C) Lie Algebras Via Fortran 90
The Fortran 90 program included in this article calculates eight matrices that form a basis of the sl(3,C) Lie algebra in an irreducible representation of the user's choice. A quick linear transformation yields a basis for the su(3) Lie algebra. The program checks that the generators satisfy the necessary commutation relations and saves the matrix generators to data files.
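For comparison, the same consistency check can be run in floating point for the fundamental irrep, where the Gell-Mann matrices give the standard basis. The following Python sketch (not the paper's Fortran 90 code) builds that basis and verifies the su(3) commutation relations numerically:

```python
import numpy as np

# The eight Gell-Mann matrices: the standard su(3) basis in the
# fundamental (3-dimensional) irrep.
l = np.zeros((8, 3, 3), dtype=complex)
l[0][0, 1] = l[0][1, 0] = 1
l[1][0, 1] = -1j; l[1][1, 0] = 1j
l[2][0, 0] = 1; l[2][1, 1] = -1
l[3][0, 2] = l[3][2, 0] = 1
l[4][0, 2] = -1j; l[4][2, 0] = 1j
l[5][1, 2] = l[5][2, 1] = 1
l[6][1, 2] = -1j; l[6][2, 1] = 1j
l[7] = np.diag([1, 1, -2]) / np.sqrt(3)

T = l / 2                                   # generators T_a = lambda_a / 2
def comm(a, b):
    return T[a] @ T[b] - T[b] @ T[a]

# Extract structure constants from the trace form tr(T_a T_b) = delta_ab/2,
# then verify closure of the algebra: [T_a, T_b] = i f_abc T_c.
f = np.zeros((8, 8, 8))
for a in range(8):
    for b in range(8):
        for c in range(8):
            f[a, b, c] = (-2j * np.trace(comm(a, b) @ T[c])).real
for a in range(8):
    for b in range(8):
        residual = comm(a, b) - 1j * np.einsum('c,cij->ij', f[a, b], T)
        assert np.allclose(residual, 0)
print("su(3) commutation relations verified; f_123 =", f[0, 1, 2])  # 1.0
```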
[70]
vixra:2512.0084 [
pdf]
Three-Body Problem: Recurrence, Alignment, and Temporal Structure in the Sun-Earth-Moon System
The classical three-body problem is traditionally formulated as the prediction of complete spatial trajectories of three interacting masses under gravitation, a task known to be generally non-integrable and chaotic. In this work, we adopt a complementary perspective focused on the Sun-Earth-Moon system, where the most stable and observable features arise not from translational motion but from rotational recurrence and angular phase closure. We introduce an angular-toroidal phase formalism in which the three bodies are represented by periodic phase variables associated with Earth rotation, Earth orbital motion, and lunar orbital motion. These phases naturally define a three-torus $T^3$, within which the system evolves as a helical flow.
[71]
vixra:2512.0082 [
pdf]
Policy Brief: Towards Emotional Healthy AI
Emotion-oriented artificial intelligence (AI), i.e., systems that detect, interpret, or simulate affective states, opens new possibilities for enhancing empathy, emotional literacy, and human-machine understanding (Picard, 1997; McStay, 2018). These technologies promise to support well-being and social connection, yet they also blur the line between genuine empathy and algorithmic manipulation. As emotional inference becomes computational, users may develop psychological dependency on empathic interfaces while being subtly steered by affect-adaptive systems (Bickmore & Picard, 2005; Turkle, 2011). Moreover, affect-recognition models trained on narrow datasets can reproduce bias and misclassify emotions across cultures (Barrett et al., 2019; Benjamin, 2019). Emotional AI thus represents not only a technical innovation but a sociocultural force that reshapes how emotions are defined, valued, and governed (Jasanoff, 2004; Latour, 2005). Developing an emotionally healthy AI policy therefore requires oversight that addresses both the scientific limits of emotion detection and the social consequences of affective manipulation. We propose a sociotechnical AI governance framework for emotionally healthy AI that covers key principles, policy recommendations, legislative advice, and technical suggestions.
[72]
vixra:2512.0073 [
pdf]
Computing the Singular Value Decomposition (SVD) with Fixed Point CORDIC Operations with Application to MIMO-OFDM
In this paper, the computation of the Singular Value Decomposition (SVD) of complex matrices is presented using fixed point arithmetic. The application of CORDIC operations for fixed point implementations of the SVD of complex matrices is introduced. The SVD plays a major role in Closed Loop MIMO-OFDM systems. The impact of a fixed point implementation of the SVD in a Closed Loop MIMO-OFDM system is examined. The ratio of Maximum to Minimum Singular Value (MMSVR) is computed for both fixed point (CORDIC) and floating point operations (using the LAPACK library). The fixed point implementation closely tracks the floating point results over fading channel models. It is shown that for highly ill-conditioned subcarriers the fixed point implementation deviates from the floating point MMSVR. This leads to noise enhancement and degradation of performance. By adding transmit diversity in Closed Loop MIMO-OFDM, the MMSVR can be reduced and performance substantially enhanced for the fixed point implementation. It is also shown how the SVD can be used in Open Loop MIMO-OFDM systems. This paper is an important introduction to the algorithms implemented in the GitHub repository for MIMO-OFDM: https://github.com/silicondsp/mimo-ofdm-release
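The floating-point MMSVR reference is straightforward to reproduce. Here is a Python sketch using NumPy's LAPACK-backed SVD; the channel matrix is a hypothetical example, not data from the paper:

```python
import numpy as np

def mmsvr(H):
    """Ratio of maximum to minimum singular value of a complex channel
    matrix, via the LAPACK-backed floating point SVD that serves as the
    reference against which fixed point CORDIC results are compared."""
    s = np.linalg.svd(H, compute_uv=False)    # singular values, descending
    return s[0] / s[-1]

rng = np.random.default_rng(0)
H = rng.standard_normal((2, 2)) + 1j * rng.standard_normal((2, 2))
print(mmsvr(H))   # a large MMSVR flags an ill-conditioned subcarrier
```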
[73]
vixra:2512.0071 [
pdf]
Local Quantum Field Theory as an Operational Reconstruction in a Timeless Euclidean Model
We derive a local quantum field theory (QFT) as an operational reconstruction in a timeless four-dimensional Euclidean model with a single fundamental real field satisfying the Laplace equation on a Euclidean space. Using a working domain Ω and an observer's body Ω₀ ⊂ Ω, we introduce events as readout outcomes on foliation hyperplanes and construct a local algebra of observables. In the special-relativistic regime the invariant null cone and a finite maximal speed $v_{max}$ are recovered operationally. Under the assumptions of locality of the transfer and reflection positivity of the state on the algebra of observables, we apply the Osterwalder-Schrader / GNS reconstruction and obtain a Hilbert space of states, unitary evolution, and the Born rule. The complex structure and amplitudes arise from the Euclidean correlation structure, the symplectic form, and the choice of complex structure on the space of local degrees of freedom; the choice of statistics (CCR/CAR) is fixed by the form of the principal symbol of the transfer generator under standard axiomatic conditions. The local ambiguity in the choice of observational description is realized as gauge symmetries: U(1) encodes phase freedom, SU(3) is the minimal group admitting color-neutral three-fermion singlets, and SU(2) is a symmetry compatible with the existence of charged chiral currents and anomaly freedom. As a result, the minimal gauge group SU(3)×SU(2)×U(1) is not postulated phenomenologically but is singled out as the minimal one compatible with explicitly stated operational assumptions and anomaly-freedom conditions. The resulting local QFT is consistent with the classical gravitational sector previously derived from the same Euclidean model and admits the formulation of the inverse problem of reconstructing the Standard Model parameters from correlators of the fundamental field and geometric observables. The work continues a program of reconstructing special and general relativity from a timeless Euclidean model and extends it by deriving a local gauge QFT with gauge group SU(3)×SU(2)×U(1), locally isomorphic to the symmetry structure of the Standard Model, from the same operational assumptions.
[74]
vixra:2512.0067 [
pdf]
Magnetism and Superconductivity in Hydrogenated Graphite Foils (Short Version)
We have previously found magnetism and superconductivity in hydrogenated graphite [Gheorghiu, et al. (2020)]. Herein, the two phenomena are observed in hydrogenated graphite foils. As the strength of the magnetic field is increased, the temperature-dependent magnetization shows several transitions between different states: from Néel paramagnetic, to antiferromagnetic, to ferromagnetic superconductor, to high-temperature superconductor with the critical temperature for the dominant phase Tc ∼ 50-60 K. The latter might be an orbital paramagnetic glass ordering of π Josephson-coupled SC domains akin to a macroscopic quantization of the system. The magnetization loops show the kink feature characteristic of granular SC. The ferromagnetism is observed up to room temperature. Thus, we observe both magnetism and superconductivity in hydrogenated graphite foils.
[75]
vixra:2512.0060 [
pdf]
Light Propagation in a Velocity-Dependent Conformal Spacetime: Effective Medium Analogy, Relativistic Kinetic Energy, and Nonlinear Imaginary Refractive Index
We present a novel interpretation of light propagation under velocity-dependent conformal Lorentz transformations (CLT). The conformal factor introduces a coordinate-dependent "medium" through which light propagates, while preserving the physical invariance of the speed of light. The properties of this medium are directly linked to relativistic kinetic energy, resulting in a velocity-dependent effective refractive index. Furthermore, the inverse transformation for observers in the moving frame gives rise to an imaginary refractive index, reflecting coordinate contraction and phase-like distortions. Nonlinear effects are also considered, where the refractive index varies nonlinearly with velocity or energy. This framework offers a unified view connecting relativistic energy.
[76]
vixra:2512.0055 [
pdf]
From Scratch: A Direct Test of the Sound-Horizon Assumption
This paper evaluates whether the comoving sound horizon can be determined directly from late-time observations. Using only empirically measured expansion rates up to redshift two, together with BAO angle measurements, the analysis shows that the majority of the sound-horizon integral lies in an unmeasured high-redshift domain. Monte Carlo sampling demonstrates that different high-redshift continuations of the expansion history produce consistent late-time distances yet yield widely different sound-horizon values. As a result, the sound horizon is not an empirically recoverable quantity with current data. All late-time determinations of r_s depend on assumptions about the early-Universe expansion history rather than measurements.
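For context, the quantity at issue is conventionally defined by the integral (the standard definition of the comoving sound horizon at the drag epoch $z_d$, not a formula specific to this paper):

$$ r_s = \int_{z_d}^{\infty} \frac{c_s(z)}{H(z)}\, dz , $$

so expansion-rate measurements reaching only $z \approx 2$ leave most of the integration range unconstrained, which is the gap the Monte Carlo sampling of high-redshift continuations exploits.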
[77]
vixra:2512.0051 [
pdf]
On a Certain Misunderstanding in the Interpretation of the Canonical Distribution of the Symbol Sequences
The canonical (Gibbs) distribution is widely used in statistical physics to describe the probabilities of microscopic states characterized by an energy value. In symbolic dynamics and the study of symbolic sequences generated by nonlinear dynamical systems, an analogous construction is frequently applied: the probability of observing a particular symbol sequence is assumed to depend exponentially on an associated "energy", often defined through a cylinder length or a Jacobian-based quantity. While this analogy is technically appealing and mathematically consistent, it has led to a persistent conceptual misunderstanding. The confusion arises when the discrete cylinder lengths are mistakenly interpreted as samples from a continuous distribution, leading to the use of probability density functions where only discrete probabilities are appropriate. In this paper, we analyze the origin of this misunderstanding, clarify the correct interpretation of the canonical distribution in symbolic dynamics, and provide practical guidance for avoiding associated pitfalls. We further illustrate the issue with examples, graphical explanations, and a discussion of implications for numerical studies of chaotic systems.
[78]
vixra:2512.0048 [
pdf]
On the Modification of the Aspect’s Experiment Scheme with Entangled Photons to Eliminate "Superluminal Loopholes"
The article proposes a modification of the Aspect experiment to test the hypothesis of nonlocality, which can be described by Lorentz-invariant equations with infinite-order derivatives. For such a modified experimental scheme, if the hypothesis is correct, the CHSH inequality is expected to hold for critical Cirel'son angles, for example, S = 1.414 for a = 0°, a′ = 45°, b = 22.5°, b′ = 67.5°. In the case of the standard, unmodified experimental scheme, the CHSH inequality is known to be violated for these angles, S = 2.7. The proposed modification consists not only of closing the locality loophole by switching the paths of the entangled photons to polarizers at different angles, but also of physically interrupting (blocking) the photon paths after their separation. Moreover, this interruption must be performed before switching the paths to other polarizers.
[79]
vixra:2512.0047 [
pdf]
The SUI/LUI Unification Framework: Discrete Noetic Events as the Fundamental Update Rule
This paper introduces the SUI/LUI framework: a discrete, thresholded update law for any system that learns, adapts, or reorganizes. A Smallest Unit of Intelligence (SUI) is defined as a minimal, irreversible update event triggered when an accumulated tension functional of prediction error crosses a critical threshold. Largest Units of Intelligence (LUIs) are the attractor structures carved by long histories of SUIs. From three axioms we derive concrete, falsifiable predictions: learning curves must be piecewise rather than smooth, chaotic physical systems should emit activity in discrete bursts, and high-entropy neural states (such as psychedelics) should show metastable attractor hopping instead of continuous drift. We then run preregistered stress-tests across four independent domains: human and artificial learning curves, asteroid Bennu's particle ejections, psychedelic dynamic functional connectivity, and cosmic-web topology. In three domains, discrete change-point and state-switching models decisively outperform smooth nulls; in cosmology, current data provide a boundary where no excess discreteness is detected. The result is an empirically grounded unification: the same drift-tension-threshold-jump pattern explains learning, surface chaos, and neural reconfiguration, with explicit fail conditions rather than universal claims.
[80]
vixra:2512.0044 [
pdf]
The Z0 and H0 Bosons as the Ground and First Excited States of a W+W− System
In this paper, we present a compelling alternative to the Standard Model's Higgs mechanism. Our framework is built on two core principles: 1) the mass of particles originates from the self-energy of their associated gauge fields, where composite particles exhibit effective charges arising from gauge dynamics, and 2) certain massive bosons are composite systems. We model the Z^0 and H^0 bosons as the ground and first excited states of a composite W^+W^− system. The most powerful aspect of this work is our striking, model-independent prediction: the binding distances of Z^0 and H^0 are related by r_H ≈ 2r_Z. This relationship naturally explains their spin difference, the vector Z^0 (S=1) versus the scalar H^0 (S=0), as triplet and singlet states of the W^+W^− system. Our model makes concrete, falsifiable predictions, including a second excited state with a predicted mass of approximately Z_3 ≈ 135.4 GeV. The search for this resonance at future colliders constitutes a crucial test that could serve as key experimental evidence for questioning fundamental assumptions about electroweak symmetry breaking. By identifying Z^0 and H^0 as different states of the same underlying system, we explain their masses without invoking the Higgs field, thereby resolving the vacuum energy fine-tuning problem. The Higgs mechanism appears to be an unnecessary construct, as the problem it addresses finds a more natural solution in the inherent properties of effective charges emerging from gauge dynamics and composite structures.
[81]
vixra:2512.0041 [
pdf]
Resolving the Hubble Tension via Intrinsic Supernova Luminosity Evolution: Evidence from Pantheon+
The Hubble Tension—the statistically significant (5σ) discrepancy between Planck and SH0ES determinations of the Hubble constant—is typically interpreted as evidence for either new physics beyond ΛCDM or local geometric anomalies (e.g., a "Local Void"). In this work, we perform a rigorous hypothesis test using the Pantheon+ Type Ia Supernovae dataset (1701 events), incorporating the full systematic covariance matrix. We test two competing models against the standard cosmological baseline: (1) a local geometric underdensity modeled via virial phase-space dynamics, and (2) an astrophysical model allowing for intrinsic luminosity evolution of SNe Ia. Our analysis yields two decisive results: geometric solutions are strongly excluded by the data, while intrinsic luminosity evolution is statistically preferred, aligning the Pantheon+ distance scale with Planck-derived H_0.
[82]
vixra:2512.0038 [
pdf]
Making Dwarf Planets as Planets Without Violating Current Definition of A Planet
The definition of a planet by the International Astronomical Union (IAU) is still ambiguous because IAU resolution B5 did not clearly define the third criterion of clearing the neighborhood. So we propose a new definition of planets, one that admits dwarf planets as planets and can be applied to all spherical bodies, covering exoplanets as well as planets in the solar system. We do this without violating the current definition of a planet.
[83]
vixra:2512.0037 [
pdf]
The Fermion Mass Lattice: A Two-Dimensional Classification of the Mass Spectrum
Building upon the fermion mass ratio formulas established in Paper II, we demonstrate that the complete charged fermion mass spectrum exhibits a two-dimensional lattice structure with basis vectors (5, 6). Every fermion mass satisfies $m = m_e \times \varphi^n$ where $n = 5a + 6b$ for integers $a, b$, and $\varphi = (1+\sqrt{5})/2$ is the golden ratio. The basis vectors have geometric interpretations: $5 = (\varphi + \varphi^{-1})^2$ encodes self-referential structure, while $6 = D/\tau$ connects spatial dimension $D = 3$ to the Aionity fixed point $\tau = 1/2$. We provide complete lattice coordinates for all nine charged fermions, explain the systematic error pattern as renormalization group flow from bare geometric masses to physical dressed masses, and derive predictions for neutrino masses at negative lattice positions. The constraint $b_{max} = D - 1 = 2$ explains why exactly three generations of fermions exist. Zero free parameters.
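The first basis-vector identity is elementary to verify: with $\varphi = (1+\sqrt{5})/2$ one has $\varphi^{-1} = (\sqrt{5}-1)/2$, so

$$ \varphi + \varphi^{-1} = \sqrt{5}, \qquad (\varphi + \varphi^{-1})^2 = 5 , $$

and likewise $6 = D/\tau = 3/(1/2)$.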
[84]
vixra:2512.0036 [
pdf]
Three [Plausible] Errors in Quantum Physics [?]
The article shows that the basic concepts of quantum physics have three major errors that are fatal to the whole theory. The first error is that in Planck's relation $E_n = nh\nu$ oscillators have only one energy level $E = h\nu$; the article explains how it arises, while the higher energy levels $nh\nu$ must be understood as runs of $n$ wavelengths of an oscillator. The second error is in Einstein's relation $E = pc$, in de Broglie's wavelength $\lambda = h/p$, and in the concept of particle-wave dualism. The error is that $p$ in $E = pc$ is not the momentum of a photon, and $p = Ev$ obtained from de Broglie's wavelength $p = h/\lambda$ and Planck's relation $E = h\nu$ is not the momentum of an oscillator. In both cases the correct formula is $p = mv_p$, where $v_p$ is the propagation speed of a matter wave. This error implies that particle-wave dualism is false. The third error is in the substitutions in the Schrödinger equation. The substitution of momentum is in error because it confuses the momentum of an oscillator with the momentum in de Broglie's wavelength formula. The substitution of energy is incorrect because oscillators do not have several energy levels. As a result of the third error it is incorrect to make a Fourier transform from the momentum coordinates to spatial coordinates, which means that a basic method in quantum field theory fails.
[85]
vixra:2512.0035 [
pdf]
Apparent Momentum in Compton Scattering
Compton scattering seemingly verifies the relativistic kinetic energy formula, but the article shows that this is not the case. The relativistic kinetic energy formula can be refuted in several ways, several of which are described in the text. The article explains how Compton scattering can be understood in the context of apparent mass.
[86]
vixra:2512.0033 [
pdf]
Regulatory Frameworks for Generative AI Enabled Digital Mental Health Devices: Safety, Transparency, and Post-Market Oversight
The rapid growth of generative artificial intelligence in digital mental health interventions offers significant opportunities to improve mental healthcare access while creating new regulatory challenges. This paper responds to recent U.S. Food and Drug Administration initiatives, including the September 2025 Digital Health Advisory Committee meeting, by proposing comprehensive regulatory frameworks for generative AI digital mental health devices. We analyze the current regulatory landscape, identifying gaps in U.S., international, and state-level governance structures. Through quantitative foundations including mathematical models for risk assessment, objective functions for regulatory optimization, and the 4 lens framework for significant change evaluation, we establish evidence-based approaches for device assessment. We present architectural diagrams covering lifecycle regulatory pathways, multi-layered safety architectures, risk-tiered assurance frameworks, and multi-stakeholder governance models. Drawing from clinical evidence showing both potential benefits and significant risks, we advocate for balanced regulatory approaches. Our framework integrates technical safeguards, ethical considerations based on care ethics, transparency requirements, and post-market monitoring systems. We provide implementation roadmaps, quantitative algorithms for regulatory decisions, and cost-benefit analyses to support practical deployment. The paper concludes with specific recommendations for risk-based classification, adaptive oversight systems, international coordination, and enhanced professional involvement to ensure these technologies provide therapeutic benefits while maintaining strong patient safety standards throughout their lifecycle. This is a review and synthesis paper that summarizes and organizes existing proposals, frameworks, and discussions from current literature; the author does not claim original authorship of the regulatory frameworks presented but rather provides a systematic analysis of the current discourse.
[87]
vixra:2512.0030 [
pdf]
Universal Divisibility Framework: A Unified Theory of Divisibility Across Integer, Rational, Real, and Complex Domains
This paper introduces the Universal Divisibility Framework (UDF), a comprehensive mathematical theory that extends the classical notion of divisibility from integers to rationals, reals, and complex numbers. The framework is built upon the Universal Divisibility Function $d(a, b, c) = \lfloor a/c \rfloor (b \bmod c) - \lfloor b/c \rfloor (a \bmod c)$, which provides a unified criterion for divisibility across multiple number systems. We establish the Universal Divisibility Theorem, proving that for $a, b \in \mathbb{R}$ with $b \neq 0$, and an integer $c$ satisfying $\lfloor b/c \rfloor = \pm 1$, we have $b \mid a$ if and only if $d(a, b, c) \equiv 0 \pmod{b}$. This framework not only recovers all classical integer divisibility rules as special cases but also eliminates false positives that arise when traditional rules are naively extended to non-integer domains. We provide explicit divisibility formulas for numbers 1-1000, demonstrate applications to Diophantine equations and matrix algebras, and discuss implications for computational number theory and cryptography.
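The integer case of the criterion is easy to exercise directly. Here is a Python sketch of the stated test, for positive integers only; the paper's extension to rationals, reals, and complex numbers is not reproduced:

```python
def d(a, b, c):
    """Universal Divisibility Function from the abstract:
    d(a,b,c) = floor(a/c)*(b mod c) - floor(b/c)*(a mod c)."""
    return (a // c) * (b % c) - (b // c) * (a % c)

def divides(b, a, c):
    """Integer instance of the Universal Divisibility Theorem: pick c
    with floor(b/c) = +/-1; then b | a iff d(a,b,c) == 0 (mod b)."""
    assert abs(b // c) == 1, "c must satisfy floor(b/c) = +/- 1"
    return d(a, b, c) % b == 0

# 7 divides 42 but not 43; c = 4 gives floor(7/4) = 1.
print(divides(7, 42, 4), divides(7, 43, 4))   # True False
```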
[88]
vixra:2512.0028 [
pdf]
Inertial Saturation: Phenomenological Regularization of Relativistic Dynamics and the Role of Gravitational Potential
In standard relativistic physics, the divergence of energy density and inertial mass as velocity approaches the speed of light (v → c) represents a classical singularity, indicating the asymptotic limitation of the mathematical model. In this paper, we introduce the "Principle of Inertial Saturation" (PIS) as a phenomenological mechanism that regularizes the Lorentz factor via an effective vacuum parameter (σ). The model is based on a generalized Mach's principle, where the local inertial limit is dynamically determined by the gravitational potential. This version includes an Independent AI Technical Validation Report (Appendix A) verifying the mathematical consistency of the PIS model and its specific predictions for lunar energy saturation limits.
[89]
vixra:2512.0027 [
pdf]
Quantum Foundations, Quantum Gravity, and Origin of Inertia: A Definitive New Realist Framework
This volume presents a comprehensive realist framework encompassing quantum foundations, quantum gravity, and the origin of inertia (Mach's Principle). It develops and builds upon a quantum ontology consisting of two fundamental ontic entities, called W-state and P-state, that respectively account for the wave- and particle-like aspects of quantum systems. W-state is a generalization of the wavefunction, but has ontic stature and is defined on the joint time-frequency domain. It constitutes a non-classical local reality, consisting of superpositions of quantum waves writ small. P-state enforces entanglement obligations and mediates the global coordination within quantum systems required to bring about wavefunction collapse in causal fashion consistent with special relativity. Quantum theory is rebuilt from the ground up, and the development then proceeds to quantum gravity, which is solved surprisingly easily once a good quantum foundations solution is in hand. It is solved not in the Planck regime, but in the testable regime of classical general relativity. The two great theories are intermeshed, and proposed testable solutions for the dark matter conundrum and origin of inertia are set forth. Both depend on non-local gravitational sourcing via P-state. The overall result is a robust architectural foundation for a (still elusive) Theory of Everything.
[90]
vixra:2512.0026 [
pdf]
A Geometric Model of Quantum-Like Behavior: Spin, Measurement, and Probabilities from Pure Spatial Geometry
We present a minimal geometric framework in which several quantum-like phenomena arise without introducing Hilbert spaces, operators, or probabilistic postulates. The only fundamental structure is a spatial Riemannian manifold $(\Sigma, h_{ij})$ on which spin states, measurement axes, and multi-particle configurations are represented as vectors of equal norm. Probabilities and correlations emerge from purely geometric relations between these vectors, most importantly from their angular separation. A single geometric expression reproduces both single-particle Born probabilities and the $-\cos\theta$ EPR correlation, depending only on how two geometric directions are interpreted. Entanglement is reinterpreted as a constraint identifying which pairs of vectors must be compared, rather than as a tensor-product structure. The resulting model is deterministic at the level of geometry but yields quantum-like probabilistic predictions through normalization and symmetry arguments.
[91]
vixra:2512.0025 [
pdf]
2D Asymmetric Risk Theory (ART-2D)
We propose the 2D Asymmetric Risk Theory (ART-2D), a rigorous framework for quantifying systemic fragility in complex adaptive systems using Langevin dynamics. Breaking with the temporal prediction paradigm, which is precluded by the Efficient Market Hypothesis, we redefine risk monitoring as the detection of structural phase transitions, analogous to financial seismology. We derive a Universal State Vector Σ(t) from coupled Langevin dynamics between convex Principals and concave Agents, isolating two orthogonal order parameters: Structural Asymmetry (A_S), derived via Itô calculus, and Informational Asymmetry (A_I), quantified by Kullback-Leibler divergence under Girsanov's Theorem. The master equation Σ = A_S × (1 + λ·A_I), calibrated with a universal coupling constant λ ≈ 8.0, produces a scalar metric of proximity to bifurcation. We identify a critical threshold Σ_crit = 0.75 separating metastable regimes (Green) from unstable regimes (Red). Empirical validation covering the 2008 Global Financial Crisis, the 2022 Terra/Luna collapse, and COVID-19 hospital saturation reveals a Conditional Risk Amplification Factor (CRAF) exceeding 6.0x in endogenous systems. We extend the model to include spectral contagion in networks and stochastic optimal control, proposing the integration of ART-2D as a physics-based substitute for lagged Basel III macroprudential indicators.
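The master equation itself is a one-liner. The following Python sketch evaluates Σ with the quoted coupling and threshold; the inputs are purely illustrative, not values from the paper:

```python
def systemic_fragility(A_S, A_I, lam=8.0, crit=0.75):
    """ART-2D master equation Sigma = A_S * (1 + lambda * A_I), with
    the quoted universal coupling lambda ~ 8.0 and critical threshold
    Sigma_crit = 0.75 separating the Green and Red regimes."""
    sigma = A_S * (1 + lam * A_I)
    regime = "Red (unstable)" if sigma >= crit else "Green (metastable)"
    return sigma, regime

print(systemic_fragility(A_S=0.05, A_I=0.02))   # illustrative inputs only
```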
[92]
vixra:2512.0021 [
pdf]
Geometric Entropic Framework: Alternative Perspective to Current Spacetime by Information Geometry
We present a reformulation of fundamental physics in which temporal evolution emerges from geometric correlations across an information-theoretically motivated foliation of spacetime. The framework is defined on a four-dimensional Lorentzian manifold $(M, g_{AB})$ equipped with a scalar entropy field $s$ whose level sets define "entropic layers." Quantum states are represented as sections of a Hilbert bundle over this foliation, with dynamics governed by a single timeless constraint equation $\hat{C}\Psi = 0$ that encodes geometric flow via an operator-valued connection $D_w$. We prove a correspondence theorem demonstrating that in the semiclassical weak-layer regime ($\epsilon := |g^{AB} \nabla_A s \nabla_B s| \ll 1$), the framework reproduces Einstein's field equations and the Schrödinger equation relative to any observer-chosen relational clock $c = C[s]$. The kinetic coefficient $Z(s)$ of the entropy field is uniquely determined by the Fisher information metric of local probability distributions, connecting continuum dynamics to information geometry and distinguishing this framework from generic scalar-tensor theories. Phenomenological predictions include Yukawa-type corrections to Newtonian gravity with coupling strength and range constrained by fifth-force experiments ($|\alpha| < 10^{-2}$ for $\lambda_s \sim 1$ mm), geometric Berry phases in atom interferometry, curvature-induced decoherence from bundle geometry, and effective dark-energy behavior in cosmology. We compare the framework to Wheeler-DeWitt theory, Page-Wootters relational mechanics, shape dynamics, and entropic gravity approaches, clarifying both conceptual similarities and essential mathematical differences. The framework provides a unified geometric substrate for gravity, quantum mechanics, and thermodynamics without invoking fundamental time as a primitive element.
[93]
vixra:2512.0020 [
pdf]
Spinning Electrons on Pendulum-Paths in Hydrogen Atoms
While electronic orbitals with zero orbital angular momentum are a standard feature of modern quantum mechanics, the corresponding linear electron paths with zero orbital angular momentum ("pendulum-paths") were explicitly excluded in the "old quantum theory" because of concerns that an electron on such paths would collide with the atom's nucleus. More recently, some researchers hypothesized that models of spinning electrons allow for electrons on pendulum-paths without collisions with the nucleus. In the present work, the scenario of a spinning electron in a hydrogen atom on a pendulum-path was numerically simulated using the bi-level electron model. The resulting trajectories were evaluated by comparing time-averaged powers of the distance between electron and proton with corresponding time-averaged values in an improved variant of the Bohr-Sommerfeld model as well as with quantum mechanical expectation values. The numerical results for a spinning electron were in better agreement with quantum mechanical expectation values than the results for the improved Bohr-Sommerfeld model.
[94]
vixra:2512.0018 [
pdf]
Generalized Scattering Operator Preserving Hermiticity, Unitarity, Causality and Convergence: Scattering Matrix Without Infinity
We derive an alternative time-evolution operator for the Heisenberg picture in five rigorous ways with different starting points to confirm its validity and generality. This time-evolution operator, called the generalized time-evolution operator, is an analytical scattering operator that is obtained in a nonperturbative way, unlike the Dyson series based on a perturbative approximation. We verify that the obtained scattering operator thoroughly preserves the Hermiticity, unitarity, and causality of the scattering operator, which are the basic requirements for a consistent scattering operator. It is analyzed that the Dyson series does not guarantee the Hermiticity, unitarity, and causality of the scattering operator, and thus is not consistent. It is demonstrated that our formulation based on the generalized time-evolution operator does not need the Feynman diagram and renormalization, and therefore the infinity problem does not exist within the framework of the theory that we constructed. Ultimately, it is shown that the new formulation enables us to construct a consistent scattering theory which, beyond the infinity problem, satisfies all necessary requirements for the scattering operator.
[95]
vixra:2512.0016 [
pdf]
Regulatory Reform for Agentic AI: Addressing Governance Challenges in Federal AI Adoption
The rapid advancement of artificial intelligence (AI), particularly agentic AI systems capable of autonomous decision-making, has exposed significant gaps in existing federal regulatory frameworks. This paper examines the regulatory barriers inhibiting AI innovation and adoption identified in the Office of Science and Technology Policy's (OSTP) Request for Information (RFI) on regulatory reform. We analyze five categories of barriers—regulatory mismatches, structural incompatibility, lack of clarity, direct hindrance, and organizational factors—and propose a comprehensive governance framework integrating technical standards, risk management protocols, and policy recommendations. Drawing from extensive literature on AI governance tools and frameworks, we present actionable solutions for modernizing federal regulations to foster responsible AI innovation while maintaining public trust and safety.