General Science and Philosophy

[1] vixra:2401.0030 [pdf]
Dictionary of Ayurveda by Dr. Ravindra Sharma and the Graphical Law
We study the Dictionary of Ayurveda by Dr. Ravindra Sharma, belonging to the Green Foundation, Dehradun, India. We plot the natural logarithm of the number of entries starting with a letter, normalised, against the natural logarithm of the rank of the letter, normalised. We conclude that the Dictionary can be characterised by BW(c=0.01), the magnetisation curve of the Ising model in the Bragg-Williams approximation in the presence of an external magnetic field, H. Here $c=\frac{H}{\gamma\epsilon}=0.01$, with $\epsilon$ being the strength of coupling between two neighbouring spins in the Ising model and $\gamma$ representing the number of nearest neighbours of a spin, which is very large.
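The rank-ordering procedure described above can be sketched as follows; the letter counts used here are invented placeholders, not data from the dictionary under study.

```python
import math

# Hypothetical entry counts per initial letter (placeholder values,
# not taken from the dictionary under study).
counts = {"a": 120, "s": 95, "k": 80, "m": 60, "p": 40, "v": 25, "y": 10}

# Rank letters by descending count; normalise both axes by their maxima.
ranked = sorted(counts.values(), reverse=True)
max_count, max_rank = ranked[0], len(ranked)

points = [
    (math.log(rank / max_rank), math.log(n / max_count))
    for rank, n in enumerate(ranked, start=1)
]

# Each point is (ln(normalised rank), ln(normalised count)); the paper
# compares such points against Bragg-Williams magnetisation curves.
for x, y in points:
    print(f"{x:+.3f}  {y:+.3f}")
```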
[2] vixra:2312.0073 [pdf]
The Intent of Hume's Discussion on the Existence of the External World
Exploring the concept of the external world's existence has been a focal point within the domain of epistemological inquiry throughout the annals of philosophy. Numerous thinkers have grappled with the question of whether one can truly fathom the existence of the external world and, if so, how such comprehension can be attained. Among these intellectual explorers stands David Hume, who approaches our perceptions of the external world as deeply rooted in matters of belief. Hume critically examines the belief in the enduring and distinct presence of external entities, even when these entities escape active perception. This inquiry delves into the origins of the belief in an external world that persists independently of our cognitive processes and sensory experiences, probing the cognitive faculties responsible for shaping such convictions. Through this exploration, it is asserted that Hume's primary aim is to illuminate the epistemological significance embedded within such beliefs.
[3] vixra:2311.0008 [pdf]
Evolution of Information and the Laws of Physics
This paper combines insights from information theory, physics and evolutionary theory to conjecture the existence of fundamental replicators, termed `femes'. Femes are hypothesised to cause transformations resulting in the structure and dynamics of the observable universe, classified as their phenotype. A comprehensive background section provides the foundation for this interdisciplinary hypothesis and leads to four predictions amenable to empirical scrutiny and criticism. Designed to be understood by a multidisciplinary audience, the paper challenges and complements ideas from various domains, suggesting new directions for research.
[4] vixra:2309.0004 [pdf]
The Computation of P and NP with Photophonon Stargates
We give a discourse on symmetry and singularity and the construction of photophonon stargates, and use them to create computers that decide and verify languages, including proofs, in polynomial time. Photophonons are quasiparticles, or synonymously stargates, that form from the oscillatory folding of singularities of cosmic light and cosmic sound with a synergetion, a novel quasiparticle. We find that, at each step in the computation of languages of any complexity, there exists a corresponding emission spectrum of photophononics, upon examination of which we observe when P = NP.
[5] vixra:2308.0017 [pdf]
Vehicle Longitudinal Dynamics Model
This technical report presents a MATLAB Simulink model that represents the longitudinal dynamics of an actual vehicle with remarkable accuracy. Through validation against empirical data, the model demonstrates a close adherence to the real-world behaviour of a vehicle, encompassing key aspects such as acceleration, braking, and velocity control. With its versatility and applicability in various engineering domains, this model is a valuable tool for automotive research, aiding in developing advanced control systems and autonomous driving technologies.
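The report's model is built in MATLAB Simulink; the same kind of longitudinal dynamics can be sketched in a few lines of text-based code. All parameter values below are illustrative assumptions, not the report's calibrated ones.

```python
# Minimal longitudinal dynamics: m*dv/dt = F_traction - F_drag - F_rolling.
# All parameters are illustrative, not the calibrated values from the report.
m = 1500.0        # vehicle mass, kg
rho = 1.225       # air density, kg/m^3
cd_a = 0.7        # drag coefficient times frontal area, m^2
c_rr = 0.015      # rolling resistance coefficient
g = 9.81          # gravitational acceleration, m/s^2

def step(v, f_traction, dt=0.01):
    """Advance velocity v (m/s) by one Euler step under traction force (N)."""
    f_drag = 0.5 * rho * cd_a * v * v
    f_roll = c_rr * m * g if v > 0 else 0.0
    a = (f_traction - f_drag - f_roll) / m
    return max(0.0, v + a * dt)

# Accelerate from rest under a constant 3 kN traction force for 10 s.
v = 0.0
for _ in range(1000):
    v = step(v, 3000.0)
print(f"speed after 10 s: {v:.1f} m/s")
```

A validated Simulink model would add powertrain, brake, and tyre submodels on top of this force balance; the sketch only shows the core integration step.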
[6] vixra:2308.0010 [pdf]
Alone in the Universe
The history of science shows the effectiveness of analogies, complementing rigorous formalism. But this practice has been neglected, leading to the current bankruptcy of official cosmology. Here are 30 formulas giving the Hubble radius, including 3 linked to the Solar System and two linking the cosmos and the Egyptian Nombrol 3570, linked to the 17th power of the golden ratio, holographically defining the meter from the terrestrial radius. In addition to these 5 specific relationships, there are 14 directly solanthropic relationships. This is comparable to Jean Perrin's book "Les atomes", which brings together 13 independent formulas involving Avogadro's number. But these relations are precise to the nearest thousandth, making a total improbability of 10^{-3x44} = 10^{-132}, whereas for Perrin's relations, precise to 10%, it is more like 10^{-1x13} = 10^{-13}. But Perrin definitively imposed the idea of the atom, i.e. the negation of the infinitely small. Here, we are talking about both the negation of the infinitely large and the "infinitely insignificant" advocated by officials. We are therefore alone in the Universe, which the James Webb telescope should confirm, after the bright rejection of Initial Big Bang cosmology by the "Universe breakers" galaxies.
[7] vixra:2307.0117 [pdf]
Networked Robots Architecture for Multiple Simultaneous Bidirectional Real Time Connections
The Architecture for Networked Robots presented in this paper is designed so that entities at the Enterprise level, such as a Java Application, can access multiple Robots with real-time, two-way, on-demand reading of sensors and control over Robot motion (actuators). If an application can simultaneously access the sensors of multiple Robots, then sophisticated algorithms can be developed to coordinate the movement of multiple Robots. The simultaneous combined full knowledge of all aspects of the Robots' sensors and motion control opens up the capability to make multiple Robots act in a coordinated and purposeful way. In addition, the Networked Robots Architecture allows multiple Enterprise Entities to have simultaneous access to the same Robot. A significant aspect of this architecture is that multiple independent entities can simultaneously access the Robot through a real-time connection. For example, while a Java Application is monitoring and controlling a Robot, another entity such as an HTML5 WebSocket Client can also control and monitor the same Robot through a Web Browser. A multi-threaded WebSocket Server with routing is combined with a separate multi-threaded TCP/IP Server called the Frontline Server. The Robots connect through the Frontline Server, which creates a thread per connection and connects to the WebSocket Server. The WebSocket Server accepts connections from Enterprise Applications (e.g. Java-based) and Remote Web-Based Applications. Each Robot has a unique identification (48 bits, represented in hexadecimal) and a truncated WebSocket Session ID that is maintained throughout the connections, including in the Robot's Firmware. Both WiFi LAN and 4G LTE WAN are supported, with Robots in both networks accessible through the Internet.
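The core routing idea, several independent sessions subscribed to one robot keyed by a 48-bit hexadecimal ID, can be sketched as an in-memory model. This is my own toy illustration, not the paper's actual Frontline or WebSocket servers; the robot ID and sensor reading are made up.

```python
import threading

class FrontlineRouter:
    """Toy sketch of the routing idea: each robot is keyed by a 48-bit
    hexadecimal ID, and any number of client sessions may subscribe to
    the same robot simultaneously."""

    def __init__(self):
        self._lock = threading.Lock()
        self._subscribers = {}   # robot_id -> list of client callbacks

    def subscribe(self, robot_id, callback):
        with self._lock:
            self._subscribers.setdefault(robot_id, []).append(callback)

    def publish_sensor_data(self, robot_id, reading):
        """Fan sensor data out to every session watching this robot."""
        with self._lock:
            targets = list(self._subscribers.get(robot_id, []))
        for cb in targets:
            cb(reading)

router = FrontlineRouter()
received = []
# Two independent "sessions" (e.g. a Java app and a WebSocket client)
# watching the same robot, identified by a hypothetical 48-bit hex ID.
router.subscribe("a1b2c3d4e5f6", received.append)
router.subscribe("a1b2c3d4e5f6", lambda r: received.append(("copy", r)))
router.publish_sensor_data("a1b2c3d4e5f6", {"sonar_cm": 42})
print(received)
```

In the real architecture each subscriber would be a socket connection handled on its own thread rather than a callback.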
[8] vixra:2306.0100 [pdf]
Sprache Als Konstrukt: Eine Untersuchung Der Implikationen Des Linguistischen Nominalismus (Language as a Construct: an Examination of the Implications of Linguistic Nominalism)
This paper explores the issue of stable meanings and references in language. It sheds light on the debate over whether words and linguistic symbols can have fixed and unchanging meanings that enable reliable communication and understanding. The text considers different perspectives and theories on the meaning of words, the relationship between language and the world, and the role of context and interpretation in communication. Arguments are presented for the possibility of stable meanings and references, while skeptical views emphasize that meanings are subjective and context-dependent.
[9] vixra:2305.0099 [pdf]
The East and West Philosophies: A Comparison of Geometric and Algebraic Structures
The East and West philosophies are often compared based on their cultural differences and historical backgrounds. However, this paper aims to compare these two philosophical traditions using mathematical concepts. Eastern philosophy can be compared to a geometric structure, where there is a strong sense of determinism and order. Western philosophy, on the other hand, can be compared to an algebraic structure, where there is a high degree of uncertainty and a need for observation and experimentation. This paper argues that these two structures represent different ways of understanding the world, and that both have their strengths and limitations.
[10] vixra:2304.0170 [pdf]
Gerhart Enders as a Scientist
This is a largely revised and extended translation of my article 'Gerhart Enders als Wissenschaftler. Zum 90. Geburtstag am 17. Oktober 2014', Brandenburgische Archive 32 (2015) 77-79, https://opus4.kobv.de/opus4-slbp/files/8026/Brandenburgische+Archive+32.pdf
[11] vixra:2304.0117 [pdf]
Pythagorean Nature
Crucial Pythagorean scientific developments, checkable by everyone, have been missed, refuting the Universe expansion and the initial Big Bang model, imposing the cosmological steady-state model. The single-electron cosmology gives a close estimation of the Hubble length, meaning that matter is in fact a matter-antimatter oscillation in a Permanent Bang cosmology, where dark matter is an out-of-phase oscillation. The nuclear fusion cosmic model gives the background temperature 2.73 Kelvin, validating Hoyle's prediction of permanent neutron creation, an ultimate limit of physics. The Diophantine treatment of the Kepler laws induces the Space-Time quantification in a Total Quantum Physics, pushing back the Planck wall by a factor 10^{61}, thus resolving the vacuum energy dilemma. The three-body gravitational hydrogen model explains the Tachyonic Three Minutes Formula giving half the Hubble radius, thus its critical mass, showing the Universe is a Particle in the Cosmos, whose radius is deduced from holographic Space-Time Quantification. The Kotov Doppler-free oscillation rehabilitates the tachyonic physics of the bosonic string theory in the Octonion Topological Axis prolonging the Quaternion Periodic Table, implying the string-spin identification, and gives G, compatible with the BIPM measurements but 2x10^{-4} larger than the official value. This confirms that the Higgs mass is tied to the third perfect couple 495-496. The so-called "free parameters", as well as the Archimedes pi, are confirmed to be computation bases, in liaison with the Holography and Holic Principles, opening the way to a revolution in mathematics where the Happy Family of the sporadic groups and the Egyptian Nombrol 3570 are central. The parameter values are deduced in the ppb domain by Optimal Polynomial Relations involving the Large Lucas Prime Number, the fourth (last) term of the Combinatorial Hierarchy.
The photon-neutrino background manages to divide this prime number by holographic terms respecting the electron-hydrogen symmetry. The data analysis rehabilitates Wyler's and Eddington's theories, which correctly predicted the Proton-Tau supersymmetry. The tachyonic synthesis defines the Neuron, the characteristic time of the neuro-musical Human, corresponding to 418/8 Hz, three octaves below the A-flat for the tuning 442.9. The Total Quantum Physics introduces the Human Measure Mass x Height, and connects with the Solar System, the CMB and the DNA through musical scales, introducing the Cosmobiology where the CMB is identified with the genetic code of the Universe (Truncated by viXra Admin).
[12] vixra:2304.0005 [pdf]
Peer2Panel: Democratizing Renewable Energy Investment With Liquid and Verifiable Tokenized Solar Panels
With an expected investment cost of around $100 trillion within the coming decades, renewable energy is at the heart of the United Nations' transition to net-zero emissions by 2050. Unfortunately, there are several challenges associated with these investments, such as low exit liquidity and the hassle of going through centralized agencies. Investment in renewable energy is thus currently mostly limited to governments, corporations, and wealthy individuals. At Peer2Panel (P2P), we address these issues by tokenizing solar panels into unique SolarT NFTs on the Ethereum blockchain, where we function as an intermediary between a token-owning individual and a physical solar panel. Apart from panel installation and maintenance, our role is to redistribute the profits from the generated energy to SolarT holders, thus making investments in SolarTs transparent, democratic, and liquid. In fact, ownership of a SolarT token gives a direct ownership interest in the solar panels owned by P2P, which can then be exchanged freely on-chain, removing most of the hassles of the traditional energy market. In addition, P2P leverages the most recent innovations from the decentralized finance (DeFi) ecosystem to offer SolarT-collateralized loans, instant liquidity, and multi-chain interoperability to its investors.
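The redistribution step described above amounts to a pro-rata split of energy profits across token holders. A toy sketch of that arithmetic (addresses and balances are made up; the actual on-chain mechanism is not specified here):

```python
def distribute_profits(holders, total_profit):
    """Pro-rata profit split among SolarT holders, keyed by token count.
    A toy illustration of the redistribution idea only."""
    total_tokens = sum(holders.values())
    return {addr: total_profit * n / total_tokens
            for addr, n in holders.items()}

# Hypothetical holders: Alice owns 3 SolarTs, Bob owns 1.
payouts = distribute_profits({"0xAlice": 3, "0xBob": 1}, 400.0)
print(payouts)
```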
[13] vixra:2301.0101 [pdf]
Modeling Bias in Vaccine Trials Relying on Fragmented Healthcare Records
COVID-19 vaccine trials depend on the localization of vaccination records for each trial subject. Misclassification bias occurs when vaccination records cannot be localized or uniquely identified. This bias may be significant in trials where the trial subjects’ vaccination and health records are distributed between more than one database. The potential for this bias is present in numerous published COVID-19 vaccine trials. A model is proposed for estimation of the magnitude of this bias on apparent vaccine efficacy. In the model, misclassification is always in the direction from partial or fully vaccinated status to unvaccinated status. The model predicts a disproportionate effect of vaccination status misclassification on the apparent vaccine efficacy when population vaccination rates are high.
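A back-of-envelope version of this kind of bias model can be sketched as follows. This is my own illustration with invented numbers, not the paper's actual model: a fraction of vaccinated subjects (and their cases) are counted as unvaccinated, which dilutes the observed unvaccinated attack rate and pulls apparent efficacy down, more so at high coverage.

```python
def apparent_ve(true_ve, vax_rate, attack_unvax, misclass_frac):
    """Apparent vaccine efficacy when a fraction of vaccinated subjects
    (and their cases) are misclassified as unvaccinated.
    All parameters are illustrative; the paper's model may differ."""
    r_v = attack_unvax * (1.0 - true_ve)   # true attack rate, vaccinated
    p, f = vax_rate, misclass_frac
    # Misclassified vaccinated people inflate the unvaccinated denominator
    # while contributing their (low-rate) cases to its numerator.
    rate_unvax_obs = ((1 - p) * attack_unvax + p * f * r_v) / ((1 - p) + p * f)
    rate_vax_obs = r_v                      # remaining vaccinated group
    return 1.0 - rate_vax_obs / rate_unvax_obs

# Same true efficacy (90%) and misclassification (5%), two coverage levels:
low = apparent_ve(0.9, 0.3, 0.05, 0.05)
high = apparent_ve(0.9, 0.9, 0.05, 0.05)
print(f"apparent VE at 30% coverage: {low:.3f}")   # close to 0.9
print(f"apparent VE at 90% coverage: {high:.3f}")  # pulled further down
```

With zero misclassification the function returns the true efficacy exactly, which is a useful sanity check on the algebra.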
[14] vixra:2301.0022 [pdf]
Properties of Elementary Particles, Dark Matter, and Dark Energy
This paper suggests new elementary particles, a specification for dark matter, and modeling regarding dark-energy phenomena. Thereby, this paper explains data that other modeling seems not to explain. Suggestions include some methods for interrelating properties of objects, some catalogs of properties, a method for cataloging elementary particles, a catalog of all known and some method-predicted elementary particles, neutrino masses, quantitative explanations for observed ratios of non-ordinary-matter effects to ordinary-matter effects, qualitative explanations for gaps between data and popular modeling regarding the rate of expansion of the universe, and insight regarding galaxy formation and evolution. Key assumptions include that nature includes six isomers of most elementary particles and that stuff that has bases in five isomers underlies dark-matter effects. Key new modeling uses integer-arithmetic equations; stems from, augments, and does not disturb successful popular modeling; and helps explain aspects and data regarding general physics, elementary-particle physics, astrophysics, and cosmology.
[15] vixra:2211.0123 [pdf]
Preventing Advanced Eugenics and Generational Testosterone Decline
The goal of the paper is to prevent eugenics against testosterone by disclosing its possible shapes, because the first step in preventing it is to know that the idea/weakness exists. We cannot protect ourselves from something that we do not know exists. We should not neglect the fact that testosterone levels have been dropping substantially in recent decades [1] [2] [3] [4], across generations, and independently of age.
[16] vixra:2211.0049 [pdf]
Vacío Y Energía (Vacuum and Energy)
The purpose of this document is to show that the energy of the vacuum and the task of producing a vacuum are directly and immediately linked. Neither in the classrooms nor in the student literature is this detail mentioned. The objective is to remedy the deficiency.
[17] vixra:2209.0055 [pdf]
New Quantum Spin Perspective and Space-Time of Mind-Stuff
The fundamental building block of loop quantum gravity (LQG) is the spin network, which is used to quantize physical space-time in LQG. Recently, a novel quantum spin perspective was proposed using the basic concepts of the spin network. This perspective redefines the notion of quantum spin and also introduces a novel definition of the reduced Planck constant. The implications of this perspective are not limited to quantum gravity; they are also found in quantum mechanics. Using this perspective, we also propose the quantization of the mind-stuff. The similarity between physical space-time and the space-time of the mind-stuff provides novel notions for studying space-time scientifically as well as philosophically. A comparison between physical space-time and the space-time of the mind-stuff is also presented.
[18] vixra:2206.0095 [pdf]
The Undecidable Charge Gap and the Oil Drop Experiment
Decision problems in physics have been an active field of research for quite a few decades resulting in some interesting findings in recent years. However, such research investigations are based on a priori knowledge of theoretical computer science and the technical jargon of set theory. Here, I discuss a particular, but a significant, instance of how decision problems in physics can be realized without such specific prerequisites. I expose a hitherto unnoticed contradiction, that can be posed as a decision problem, concerning the oil drop experiment and thereby resolve it by refining the notion of ``existence'' in physics. This consequently leads to the undecidability of the charge spectral gap through the notion of ``undecidable charges'' which is in tandem with the completeness condition of a theory as was stated by Einstein, Podolsky and Rosen in their seminal work. Decision problems can now be realized in connection to basic physics, in general, rather than quantum physics, in particular, as per some recent claims.
[19] vixra:2206.0049 [pdf]
Blocking Aircraft
By arranging a small decoy between an aircraft and an AAM and making the decoy meet the AAM, or by various other means, we free the aircraft from the AAM's threat.
[20] vixra:2205.0154 [pdf]
Unprovability of First Maxwell's Equation in Light of EPR's Completeness Condition
Maxwell's verbal statement of Coulomb's experimental verification of his hypothesis, concerning force between two electrified bodies, is suggestive of a modification of the respective computable expression on logical grounds. This modification is in tandem with the completeness condition for a physical theory, that was stated by Einstein, Podolsky and Rosen in their seminal work. Working with such a modification, I show that the first Maxwell's equation, symbolically identifiable as ``$\vec{\nabla}\cdot\vec{E}=\rho/\epsilon_0$'' from the standard literature, is {\it unprovable}. This renders Poynting's theorem to be {\it unprovable} as well. Therefore, the explanation of `light' as `propagation of electromagnetic energy' comes into question on theoretical grounds.
[21] vixra:2204.0126 [pdf]
Scientific Method and Game Theory as Basis of Knowledge and Language
We use methods of science (parts of falsificationism) and game theory (focal points) as a foundation of knowledge and language. We draw some parallels to human sensory experience, using recent progress in AI, and demonstrate how we know basic facts about space, ourselves, or other people. We then demonstrate how we can understand and construct language with these methods, giving examples from the Tok Pisin language. We then demonstrate the viability of this approach for the clarification of philosophy. We show that our theory is a good answer to many linguistic conundrums given in Wittgenstein's "Philosophical Investigations". We also demonstrate an application to other philosophical problems.
[22] vixra:2202.0170 [pdf]
Evolution of the Universe in an Infinite Space
This hypothesis considers the current universe to be a result of evolution in an infinite data space. The laws and properties of the universe are explained in terms of their function as evolutionary products. There is evidence for this hypothesis in the form of error correcting codes (see section 2.4).
[23] vixra:2202.0054 [pdf]
Inadequacy of Classical Logic in Classical Harmonic Oscillator and the Principle of Superposition
In the course of the development of modern science, the inadequacy of classical logic and Eastern philosophy have generally been associated only with quantum mechanics in particular, notably by Schroedinger, Finkelstein and Zeilinger among others. Our motive is to showcase a deviation from this prototypical association. So, we consider the equation of motion of a classical harmonic oscillator, and demonstrate how our habit of writing the general solution, by applying the principle of superposition, cannot be explained by remaining within the bounds of classical logic. The law of identity gets violated. The law of non-contradiction and the law of excluded middle fail to hold strictly throughout the whole process of reasoning, consequently leading to a decision problem where we cannot decide whether these two `laws' hold or not. We discuss how we, by habit, apply our intuition to write down the general solution. Such intuitive steps of reasoning, if formalized in terms of propositions, result in a manifestation of the inadequacy of classical logic. In view of our discussion, we conclude that the middle way ({\it Mulamadhyamakakarika}), a feature of Eastern philosophy, forms the basis of human reasoning. The essence of the middle way can be realized through self-inquiry ({\it Atmavichar}), another crucial feature of Eastern philosophy, which is exemplified by our exposition of the concerned problem. From the Western point of view, our work showcases an example of Hilbert's axiomatic approach to deal with the principle of superposition in the context of the classical harmonic oscillator. In the process, it becomes a manifestation of Brouwer's views concerning the role of intuition in human reasoning and the inadequacy of classical logic, which were very much influenced by, if not founded upon, Eastern philosophy.
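For reference, a minimal sketch (my own, in standard notation) of the superposition step the abstract scrutinises: the equation of motion is linear, so a linear combination of solutions is again a solution, and the general solution is customarily written by superposition.

```latex
% Equation of motion of the classical harmonic oscillator:
\[
  \ddot{x} + \omega^{2} x = 0 .
\]
% If x_1(t) and x_2(t) are solutions, linearity gives, for constants A, B,
\[
  \frac{d^{2}}{dt^{2}}\bigl(Ax_{1}+Bx_{2}\bigr)
  + \omega^{2}\bigl(Ax_{1}+Bx_{2}\bigr)
  = A\bigl(\ddot{x}_{1}+\omega^{2}x_{1}\bigr)
  + B\bigl(\ddot{x}_{2}+\omega^{2}x_{2}\bigr) = 0 ,
\]
% so the general solution is customarily written as the superposition
\[
  x(t) = A\cos\omega t + B\sin\omega t .
\]
```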
[24] vixra:2201.0112 [pdf]
A Note on Mass and Gravity
The principle of equivalence implies that inertial mass equals gravitational mass. Gravity is understood in terms of the quark model, amended by Platonic symmetry. This allows us to comment on the origin of inertial mass and how it could be controlled by controlling gravity.
[25] vixra:2201.0073 [pdf]
The Subtle Curse of Creative-Social Creatures, the Truth Behind an Inevitable Mankind Invention: Christianity
Do Christians understand Christianity, or do they have faith? Can you destroy a religion by replacing understanding with faith? Using overwhelming objective arguments, we claim to have decoded and "unearthed" Christianity (and probably, if they exist, Christian-like religions too), introducing the authentic version of Christianity, explaining many of its fundamental aspects. We even proved the compatibility between Christianity (creationist) and Darwinism, and proposed a shockingly eerie hypothesis for the question: "Why would God allow undeserved suffering?". The philosophy of life, with its objective arguments, was hiding under our noses, disguised as something else. Like gravity is only an illusion (according to Einstein's General Theory of Relativity), and like borders are social constructs, and like fiat money (which has no intrinsic value) is a social construct, is corruption also only an illusion, and a social construct? We suggest that the answer is yes: corruption is (sometimes, partly) a (curable) social construct, that stems from misunderstandings, psychological defects (created by evolution), incentives, conflict of interest, and lack of trust, a social construct supported by inheritable things (sins) such as war, and antisocial systems. Occasionally we propose ideas to help combat and prevent both corruption and inheritance of sin. For many years, I thought that I was an agnostic atheist, but now I know that the reason why I was agnostic, is because my God was literally the truth. If you exist, then that means other people like you also exist. Remember: freedom of expression and the truth are the most valuable things, however, the authorities might disagree with our version of freedom of expression.
This article has missing information, because "art is never finished, only abandoned" (Leonardo da Vinci), experts make mistakes, and the author is brainwashed, so please do not believe everything that is written in this article just because you like or believe some or most things you found here (see cognitive biases: the halo effect, confirmation bias, frequency bias, and potentially others)! However, if you are not hated or quizzaciously ridiculed for the things you say, then you are not a good philosopher.
[26] vixra:2111.0027 [pdf]
Scientific Value of the Quantum Tests of Equivalence Principle in Light of Hilbert's Sixth Problem
In his sixth problem, Hilbert called for an axiomatic approach to theoretical physics with an aim to achieve precision and rigour in scientific reasoning, where the logic and language (semantics) of physics play the pivotal role. It is from such a point of view that we investigate the scientific value of the modern experiments that perform quantum tests of the equivalence principle. Determination of the Planck constant involves the use of the acceleration due to gravity of the earth (g), which results in the force on a test mass. The equivalence between the inertial mass and the gravitational mass of a test object is assumed in the process of logically defining g from the relevant hypotheses of physics. Consequently, if the Planck constant is used as input in any experiment (or in the associated theory that founds such an experiment) designed to test the equivalence between inertial and gravitational mass, then this amounts to establishing a scientific truth by implicitly assuming it, i.e. a tautology. There are several notable examples at the frontiers of current scientific research which claim to make quantum tests of the equivalence principle. We question the scientific value of such experiments from Hilbert's axiomatic point of view. This work adds to the recently reported semantic obstacle in any axiomatic attempt to put "quantum" and "gravity" together, albeit with an experimental tint.
[27] vixra:2111.0026 [pdf]
Cauchy's Logico-Linguistic Slip, the Heisenberg Uncertainty Principle and a Semantic Dilemma Concerning ``Quantum Gravity''
The importance of language in physics has gained emphasis in recent times, on the one hand through Hilbert's views that concern formalism and intuition applied to outer inquiry, and on the other hand through Brouwer's point of view that concerns intuition applied to inner inquiry or, as I call it, self-inquiry. It is to demonstrate the essence of such investigations, especially self-inquiry (inward intuition), that I find it compelling to report that a careful analysis of Cauchy's statements for the definition of the derivative, as applied in physics, unveils the connection to the Heisenberg uncertainty principle as a condition for the failure of classical mechanics. Such logico-linguistic, or semantically driven, self-inquiry of physics can provide new insights to physicists in the pursuit of truth and reality, for example, in the context of the Schroedinger equation. I point out an explicit dilemma that plagues the semantics of physics, as far as general relativity and quantum mechanics are concerned, which needs to be taken into account during any attempt to pen down a theory of ``quantum gravity''.
[28] vixra:2110.0143 [pdf]
A Logico-Linguistic Inquiry Into the Foundations of Physics: Part I
Physical dimensions like ``mass'', ``length'', ``charge'', represented by the symbols $[M], [L], [Q]$, are {\it not numbers}, but used as {\it numbers} to perform dimensional analysis in particular, and to write the equations of physics in general, by the physicist. The law of excluded middle falls short of explaining the contradictory meanings of the same symbols. The statements like ``$m\to 0$'', ``$r\to 0$'', ``$q\to 0$'', used by the physicist, are inconsistent on dimensional grounds because ``$ m$'', ``$r$'', ``$q$'' represent {\it quantities} with physical dimensions of $[M], [L], [Q]$ respectively and ``$0$'' represents just a number -- devoid of physical dimension. Consequently, the involvement of the statement ``$\lim_{q\to 0}$, where $q$ is the test charge'' in the definition of electric field, leads to either circular reasoning or a contradiction regarding the experimental verification of the smallest charge in the Millikan-Fletcher oil drop experiment. Considering such issues as problematic, by choice, I make an inquiry regarding the basic language in terms of which physics is written, with an aim of exploring how truthfully the verbal statements can be converted to the corresponding physico-mathematical expressions, where ``physico-mathematical'' signifies the involvement of physical dimensions. Such investigation necessitates an explanation by demonstration of ``self inquiry'', ``middle way'', ``dependent origination'', ``emptiness/relational existence'', which are certain terms that signify the basic tenets of Buddhism. In light of such demonstration I explain my view of ``definition''; the relations among quantity, physical dimension and number; meaninglessness of ``zero quantity'' and the associated logico-linguistic fallacy; difference between unit and unity. 
Considering the importance of the notion of electric field in physics, I present a critical analysis of the definitions of electric field due to Maxwell and Jackson, along with the physico-mathematical conversions of the verbal statements. The analysis of Jackson's definition points towards an expression of the electric field as an infinite series due to the associated ``limiting process'' of the test charge. However, it brings out the necessity of a postulate regarding the existence of charges, which nevertheless follows from the definition of quantity. Consequently, I explain the notion of {\it undecidable charges} that act as the middle way to resolve the contradiction regarding the Millikan-Fletcher oil drop experiment. In passing, I provide a logico-linguistic analysis, in physico-mathematical terms, of two verbal statements of Maxwell in relation to his definition of electric field, which suggests Maxwell's conception of dependent origination of distance and charge (i.e. $[L]\equiv[Q]$) and that of emptiness in the context of relative vacuum (in contrast to the modern absolute vacuum). This work is an appeal for the dissociation of the categorical disciplines of logic and physics and, at large, a fruitful merger of Eastern philosophy and Western science. Nevertheless, it remains open to how the reader relates to this work, which is the essence of emptiness.
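For reference, the standard limiting definition under scrutiny can be sketched as follows (my own rendering in conventional notation, not a quotation from Maxwell or Jackson):

```latex
% Standard limiting definition of the electric field via a test charge q:
\[
  \vec{E} = \lim_{q \to 0} \frac{\vec{F}}{q} ,
\]
% which the analysis above treats as dimensionally problematic, since q
% carries the physical dimension [Q] while the limit point 0 is a bare number.
```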
[29] vixra:2110.0142 [pdf]
Logic, Philosophy and Physics: a Critical Commentary on the Dilemma of Categories
I provide a critical commentary regarding the attitude of the logician and the philosopher towards the physicist and physics. The commentary is intended to showcase how a general change in attitude towards making scientific inquiries can be beneficial for science as a whole. However, such a change can come at the cost of looking beyond the categories of the disciplines of logic, philosophy and physics. It is through self-inquiry that such a change is possible, along with the realization of the essence of the middle that is otherwise excluded by choice. The logician, who generally holds a reverential attitude towards the physicist, can then actively contribute to the betterment of physics by improving the language through which the physicist expresses his experience. The philosopher, who otherwise chooses to follow the advancement of physics and gets stuck in the trap of sophistication of language, can then be of guidance to the physicist on intellectual grounds by having the physicist's experience himself. In the course of this commentary, I provide a glimpse of how a truthful conversion of verbal statements to physico-mathematical expressions unravels the hitherto unrealized connection between the Heisenberg uncertainty relation and Cauchy's definition of the derivative that is used in physics. The commentary can be essential reading if the reader is willing to look beyond the categories of logic, philosophy and physics by being `nobody'.
[30] vixra:2110.0035 [pdf]
Index Type Hand Symbol (2)
We apply the principle of the index-type keyboard to hand signs. Because the learning is very easy, not only a person without hearing but also a person with hearing can make full use of it.
[31] vixra:2109.0146 [pdf]
Reckoning Dimensions
In this article, we seek an alternative avenue--in contrast to the conventional hypercube approach--to reckon physical or abstract dimensions from an information perspective alone. After briefly reviewing ``bit'' and ``quantum of information--it'', we propose a scheme to perceive higher dimensions using bits and concentric spherical shells that are intrinsically entangled.
[32] vixra:2109.0145 [pdf]
Matter, Consciousness, and Causality--Space, Time, Measurement, and more
In this article, we refine the elements of physics. We consider [primordial] matter and consciousness as eternal and the causes of the creation of the universe via causality. We regard causality as the fundamental and ecumenical principle of the universe. Furthermore, we define space and time in terms of cause and effect, and revisit other important notions in physics.
[33] vixra:2107.0144 [pdf]
Index Type Hand Symbol
We apply the principle of the index-type keyboard to hand signs. Because the learning is very easy, not only a person without hearing but also a person with hearing can make full use of it.
[34] vixra:2106.0047 [pdf]
Design and Analysis of a Multiband Fractal Antenna for Applications in Cognitive Radio Technologies
Rapid development in wireless communication systems and an increase in the number of users of wireless devices is bound to result in spectrum shortage in the near future. The concept of cognitive radio is envisaged to be a paradigm of new methodologies for achieving a performance-enhanced radio communication system through efficient utilization of the available spectrum. Research on antenna design is very critical for the implementation of cognitive radio, which requires a special antenna for sensing and communication purposes. This paper investigates the use of multiband fractal antennas for spectrum sensing applications in cognitive radio units. The performance of a new fractal antenna design which generates four bands of operation in the range of 900-4000 MHz has also been studied. Through a thorough discussion of its return loss and radiation plots, as well as other parameters such as gain and radiation efficiency, it is shown to be a promising antenna for future cognitive radio systems.
[35] vixra:2106.0046 [pdf]
Triple Band Antenna Design for Bluetooth, WLAN and WiMAX Applications
A novel and compact tri-band planar antenna for 2.4/5.2/5.8-GHz wireless local area network (WLAN), 2.3/3.5/5.5-GHz Worldwide Interoperability for Microwave Access (WiMAX) and Bluetooth applications is proposed and studied in this paper. The antenna comprises an L-shaped element which is coupled with a ground-shorted parasitic resonator to generate three resonant modes for tri-band operation. The L-shaped element, which is placed on top of the substrate, is fed by a 50-ohm microstrip feed line and is responsible for the generation of a wide band at 5.5 GHz. The parasitic resonator is placed on the other side of the substrate and is directly connected to the ground plane. The presence of the parasitic resonator gives rise to two additional resonant bands at 2.3 GHz and 3.5 GHz. Thus, together the two elements generate three resonant bands to cover the WLAN, WiMAX and Bluetooth bands of operation. A thorough parametric study has been performed on the antenna and it has been found that the three bands can be tuned by varying certain dimensions of the antenna. Hence, the same design can be used for frequencies in adjacent bands as well, with minor changes in its dimensions. Important antenna parameters such as return loss, radiation pattern and peak gains in the operating bands have been studied in detail to show that the proposed design is a promising candidate for the aforementioned wireless technologies.
[36] vixra:2105.0137 [pdf]
One Cannot Observe with a Detector the Impact of the Zero Cross-Section Dark Matter Particle: it is Invisible Matter
The indirect detection of Dark Matter rests on gravitational anomalies in the cosmos, e.g. flat rotation curves in galaxies. The leading journals explain the lack of direct detection by the very small impact cross-section of the Dark Matter components. I argue that in the case of Particle Dark Matter the cross-section is infinitely small, so it can never be directly detected. In such a case I would use the term "Dark Matter of Virtual Particles". Its representative is the hypothetical sterile neutrino. I am not limiting my research to the Particle Dark Matter model.
[37] vixra:2105.0072 [pdf]
A Novel Compact Tri-Band Antenna Design for WiMAX, WLAN and Bluetooth Applications
A novel and compact tri-band planar antenna for 2.4/5.2/5.8-GHz wireless local area network (WLAN), 2.3/3.5/5.5-GHz Worldwide Interoperability for Microwave Access (WiMAX) and Bluetooth applications is proposed and studied in this paper. The antenna comprises an L-shaped element which is coupled with a ground-shorted parasitic resonator to generate three resonant modes for tri-band operation. The L-shaped element, which is placed on top of the substrate, is fed by a 50-ohm microstrip feed line and is responsible for the generation of a wide band at 5.5 GHz. The parasitic resonator is placed on the other side of the substrate and is directly connected to the ground plane. The presence of the parasitic resonator gives rise to two additional resonant bands at 2.3 GHz and 3.5 GHz. Thus, together the two elements generate three resonant bands to cover the WLAN, WiMAX and Bluetooth bands of operation. A thorough parametric study has been performed on the antenna and it has been found that the three bands can be tuned by varying certain dimensions of the antenna. Hence, the same design can be used for frequencies in adjacent bands as well, with minor changes in its dimensions. Important antenna parameters such as return loss, radiation pattern and peak gains in the operating bands have been studied in detail to show that the proposed design is a promising candidate for the aforementioned wireless technologies.
[38] vixra:2104.0152 [pdf]
Is Our World an Intelligent Simulation?
Elon Musk seems to believe that our world is an intelligent simulation; that part of our world is simulated (part A), and part is not (part B): it is like augmented reality, made by highly advanced beings. I argue that part B is a galaxy, but part A is the Dark Matter surrounding that galaxy.
[39] vixra:2104.0151 [pdf]
Natural Boundaries
My understanding of modern physical discoveries does not modify the existing equations; only the values become bounded, to avoid infinities and singularities: "the sand a boundary for the sea, an everlasting barrier it cannot cross. The waves may roll, but they cannot prevail; they may roar, but they cannot cross it." Jeremiah 5:22 NIV.
[40] vixra:2012.0108 [pdf]
Tachyons for Interstellar Communication
Concerns that tachyons, which have imaginary mass, may violate causality have been discussed in the context of two distinct embodiments for constructing a message loop. One employs transmitters in motion relative to receivers, while the other has transmitters and receivers at rest with respect to each other, with messages passed between moving observers using electromagnetic signals. The latter (Method II) is of interest only to those who seek to disprove the existence of faster-than-light phenomena by constructing hypothetical thought experiments based solely upon kinematics that purportedly violate causality, often by specious means. The former (Method I), on the other hand, is based upon the wider foundation of both kinematics and dynamics, and sound analysis proves that causality is not violated. For Method I, the relative speed between transmitter and receiver limits the propagation speed according to u = c^2/v, where u is the maximum possible propagation speed and v is the relative speed between transmitter and receiver. This paper discusses this paradigm for communicating between outposts in different star systems. Techniques are discussed for increasing propagation speed beyond that limited by the relative motion between earth and a planetary base in orbit around a distant star.
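The abstract's relation u = c^2/v is simple enough to evaluate directly. A minimal sketch follows; the 30 km/s example speed is an assumption for illustration, not a figure from the paper:

```python
# Illustrative evaluation of the abstract's relation u = c^2 / v,
# the maximum tachyon propagation speed u for a given relative
# speed v between transmitter and receiver.

C = 299_792_458.0  # speed of light, m/s

def max_propagation_speed(v: float) -> float:
    """Return u = c^2 / v in m/s for a relative speed v (0 < v < c)."""
    if not 0.0 < v < C:
        raise ValueError("relative speed must satisfy 0 < v < c")
    return C**2 / v

# Example (assumed): a receiver receding at 30 km/s,
# roughly Earth's orbital speed around the Sun.
u = max_propagation_speed(30_000.0)
print(u / C)  # u expressed as a multiple of c
```

Note the inverse dependence: the slower the relative motion, the higher the permitted propagation speed, which is why the paper looks at techniques for working around the Earth-base relative velocity.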
[41] vixra:2008.0023 [pdf]
Aristotle's View on Physics and Philosophy
Aristotle, one of the renowned Greek philosophers, made a huge contribution to Western philosophy by helping lay the foundation of symbolic logic and scientific thinking, in addition to making advances in the field of ``Metaphysics''. He was probably the first to treat the theory of ``Virtue Ethics'' seriously. These contributions made him possibly the most important philosopher until the 18th century. This article gives a brief overview of his ideas on physics and philosophy.
[42] vixra:2007.0106 [pdf]
Contradictions, Mathematical Science and Incompleteness
Do you believe that science is based on contradictions? Let me consider the common experience of seeing a dot of the pencil on a paper. If I call the dot ``zero length dimension'' or ``zero extension'', then certainly I have seen `nothing'. But, if I have seen `nothing', I wonder how I can refer to `nothing', let alone naming `nothing' as ``a point''. Therefore, the expression ``zero length dimension'' is a contradiction. Mathematical science, as of now, is based on this contradiction that results from the attitude of exactness, because exact ``zero'' is non-referable and inexpressible. Such attitude leads to incomplete statements like ``infinitesimal quantities'' which never mention ``with respect to'' what. Consequently, as I find, science becomes fraught with singularity. I avoid this contradiction, by accepting my inability to do exact science. Therefore, I consider the dot as of ``negligible length dimension''. It is a practical statement rather than a sacrosanct axiom. The practicality serves the purpose of drawing geometry, that becomes impossible if I decide or choose to look at the dot through a magnifying glass. It then answers a different practical question, namely, what the dot is made up of. Certainly, reality of the dot depends on how I choose or decide to observe it. This is the essence of ``relational existence''. On the contrary, modern mathematical science is founded upon belief of ``independent existence''(invariant). My belief in inexact mathematical science and relational existence needs the introduction of an undecidable length unit to do arithmetic and leads to non-singular gravity. Further, the quest for justification of my choice or decision leads to my incompleteness -- ``I''-- the undecidable premise beyond science, the expression of which is a (useful) contradiction in itself as ``I'' is inexpressible.
[43] vixra:1911.0437 [pdf]
Impossibility of Gravitons and bi-Metric Gravity; Riemann Hypothesis Confirmed; Energy Localization Problem Solved; the Falsifiability of Science is Demonstrated
The paper ``in trend''~\cite{Meissner} also talks about gravitons (at least the word ``gravitino'' is in the abstract). Gravitons are transmitters of the gravitational force, but there is no force of gravity in General Relativity. And how could there be in any adequate theory, if a freely falling body feels no dragging force (but only weightlessness)? So, the paper just adds to the general misunderstanding. The latter is positioned~\cite{drive} as the driving engine of science (like radioactive mutations in biology), so the question arises: how many papers are a bit wrong?
[44] vixra:1908.0556 [pdf]
The Space of Unsolvable Tasks: Formulation of the Problem, or Anti-Tank Hyper-Hedgehogs in N-Dimensional Space
With the narrow specialization of scientists, the development of science leads to the rapid growth of the space of unsolvable tasks, which grows faster than the area of existing scientific knowledge. The practical development of the field of unsolvable tasks is possible only through the efforts of the universal scientists of the future, who must have a high level of scientific knowledge in several general scientific disciplines.
[45] vixra:1908.0553 [pdf]
On Axioms in Normative Ethics (規範倫理学における公理について)
I wrote about the idea that absolute ethical laws could be found in an axiomatic way in normative ethics, together with a self-objection to it and an improvement in response. Before that, I reviewed the concept of an axiom and the major ideas of normative ethics.
[46] vixra:1908.0516 [pdf]
Developing an Integrative Framework of Artificial Intelligence and Blockchain for Augmenting Smart Governance
Government systems are often slow, opaque and prone to corruption. The public benefits system, in general, suffers from slowness and bureaucracy. In this paper, we propose a system that utilizes blockchain and artificial intelligence techniques for governance, which enables the government to function more efficiently and with more transparency, thus increasing the level of trust people have in their government and in democracy. The public distributed ledger Ethereum (MainNet) is the backbone of this proposed system. Elliptic-curve cryptography generates a public-private keypair, with SHA-256 used as the hash function. Each transaction is validated by P2SH and consensus is achieved through a Proof-of-Work algorithm. A smart contract encodes the algorithm and enforces constraints on users' activity. Artificial intelligence is used to analyze the data wherever necessary, and the output of the network is used as a trigger for activating the smart contract, which can be connected via IoT services and automation devices. This can be used for making government contracts more transparent. Some use cases are automatic payments based on achieved deadlines and allowing public consensus on government policies. Other applications include a better-functioning public benefits system that allows the government to directly provide the public with incentives rather than relying on middlemen. Decentralization via blockchain is a complete end-to-end solution for democratizing the current systems.
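The tamper-evidence that a blockchain backbone provides comes from hash-linking: each block commits to its predecessor via a SHA-256 digest. A minimal sketch of that mechanism, assuming a simplified block structure (the field names and payloads here are illustrative, not from the paper):

```python
# Minimal illustration of SHA-256 hash-linking between blocks:
# each block stores the digest of the previous block, so altering
# any earlier block invalidates every later back-link.

import hashlib
import json

def block_hash(block: dict) -> str:
    """SHA-256 digest of a block's canonical JSON encoding."""
    encoded = json.dumps(block, sort_keys=True).encode("utf-8")
    return hashlib.sha256(encoded).hexdigest()

def make_block(prev_hash: str, payload: str) -> dict:
    return {"prev": prev_hash, "payload": payload}

genesis = make_block("0" * 64, "genesis")
block1 = make_block(block_hash(genesis), "payment released on milestone 1")

# Recomputing the genesis hash must reproduce block1's back-link;
# a tampered genesis block would break this check.
assert block1["prev"] == block_hash(genesis)
```

Canonical encoding (`sort_keys=True`) matters: the digest must be deterministic for independent validators to agree on it.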
[47] vixra:1907.0062 [pdf]
A Question About the Consistency of Bell's Correlation Formula
In this paper it is demonstrated that two equally consistent but conflicting uses of sign functions in the context of a simple probability density show that Bell's formula is based on only one consistent principle. The two conflicting principles give different results. However, by the commutativity of products, i.e. $3\times (1/2) = (1/2)\times 3$, one must have the same result in both cases.
[48] vixra:1904.0321 [pdf]
The Unprecedented Decade
In response to various reports of ongoing crises throughout the world, this essay has been written with the aim of proposing a radical transition in the way the world currently operates. Through general observation, the case presented below posits that human labour is insufficient to provide the means of modern lifestyles, and that current economic systems are incompatible with a sustainable and decent human lifestyle due to this insufficient productivity. To compensate, mechanised labour has been produced and implemented to offset this insufficiency, but at the cost of the environment and a growing human insolvency. To avoid economic and ecological disaster, this essay posits that human labour must be abandoned and replaced by sustainably powered and automated labour worldwide, simultaneously fulfilling the various global demands freely and obsoleting emissions-intensive mechanised labour. Doing so would eliminate economic contentions that prevent many from attaining a decent quality of life while also addressing the issue of heavily polluting industries.
[49] vixra:1903.0207 [pdf]
Cellular Automaton Graphics(6)
Developing a regular polyhedron on a plane, setting discrete coordinates on the development and applying the boundary condition of the regular polyhedron to it, we realize symmetrical graphics.
[50] vixra:1902.0016 [pdf]
The Primary Factors of Biological Evolution
The article discusses the theory of biological evolution. The concepts "primary factors of evolution" (natural factors) and "secondary factors of evolution" (the result of evolution itself) are defined. The terms of the theory of evolution "struggle for existence", "selection" and others are considered from the point of view of the interpretation of facts. In order for the theory of evolution to be complete and as objective as possible, it must be based on primary factors, and interpretations should be kept to a minimum. The article discusses Darwin's theory and the modern theory of evolution in the context of these problems. An attempt is made to eliminate the concept of "the struggle for existence", which leads to the following results. A new concept of "realizing the purpose to exist" has been obtained, which is an analogue of the concept of "the struggle for existence (in a wide sense)". It is substantiated that the realization of the purpose to exist is the result of evolution (that is, the secondary factor), is the main characteristic of all living organisms (can be considered as the primary factor in the context of the living), that is, distinguishes the living from the nonliving. Realization of the purpose to exist in some conditions may take the form of a struggle, which in everyday life is usually understood as a struggle for existence (in the narrow sense). It is substantiated that such behavior is an adaptation that appeared in the process of evolution and can be regulated by means of more complex behavior, which is also an adaptation. The physical bases of biological evolution are also considered on the basis of external measures of existence.
[51] vixra:1812.0240 [pdf]
Reversing Teerac
2016 is filled with what seems like a new ransomware every day. Whether the influx is due to the recent sale of the CryptoWall source code, or to the actors involved in Dyre having moved on to something profitable after the reported takedown, it would appear that, for the time being, pushing ransomware is the new hip thing in the malware world. Most of the big names in ransomware have had plenty of papers and research done on them, but many of the newer variants, while possibly based on either leaked or sold code, will more often than not make changes in order to make themselves unique. Teerac, a variant of TorrentLocker that adds subdomain generation to the hardcoded domain, is no exception: the malware matches multiple reports on TorrentLocker with the exception of the added subdomain generation.
[52] vixra:1806.0378 [pdf]
On the Origin of Extraterrestrial Industrial Civilizations
The recent discovery of billions of habitable planets within the Milky Way alone, and a practical route to nuclear fusion using the Project PACER approach, suggest that any habitable planet with intelligent life should be able to expand beyond its home planet and colonize the galaxy within a relatively short time. Given the absence of detection by SETI for the past few decades, we take for granted that no other industrial civilization exists within the galaxy and validate the rare-earth and rare-intelligence hypotheses by using rigorous astronomical and geological filters to reduce the pool of candidate host civilizations to < 1 per galaxy. Thus, the total number of habitable extraterrestrial planets within the Milky Way capable of supporting advanced, intelligent life within the next 500 Myr is < 969. Most of these are earth-like, orbiting a single star with mass ranging from 0.712 to 1 solar mass. No exomoons are capable of supporting advanced life, and a negligible number of low-mass binary systems (< 0.712 solar mass) are habitable. Among these habitable planets, the emergence of intelligence is still rare and must be a relatively recent phenomenon. By specifying a species as a combination and permutation of traits acquired through evolutionary time, a multinomial distribution profile of species can be constructed; those with fewer traits are the most common. A particular multinomial distribution is built to model the emergence of civilization by specifying homo sapiens as an outlier. The deviation is calculated based on the known cranial capacity of homo sapiens and the explosive growth of angiosperms. The multinomial distribution is then transformed/approximated into a more tractable, generalized multivariate time-dependent exponential lognormal distribution to model biological evolution from the perspective of man.
Most surprisingly, given that the emergence chance of civilization decreases exponentially into the past, as predicted by the distribution model, a wall of semi-invisibility exists due to the relativistic time delay of signal arrival at cosmological distance, so that the universe appears empty even if a significant portion of space could already be occupied. The nearest extraterrestrial industrial civilization lies at least 51.85 million light years away, and possibly at least 100 million light years or beyond. Based on the starting model, no extraterrestrial civilization arises before 119 Mya within the observable universe, and none before 138 Mya within the universe by co-moving distance. Despite the great distances between the nearest civilizations and the low probability of emergence within our vicinity, given the sheer size of the universe, the total number of intelligent extraterrestrial civilizations likely approaches infinity, or $\left(\frac{1}{4.4\cdot10^{7}}\right)^{3}\cdot3.621\cdot10^{6}\cdot10^{10^{10^{122}}}$ if the universe is finitely bounded. Based on incentives for economic growth, and assuming wormholes shorten cosmic distances, all civilizations tend to expand at near the speed of light and will eventually connect with each other via wormhole networks. Within such a network, the farthest distance traversable from earth is either infinite or $3.621\cdot10^{6}\cdot10^{10^{10^{122}}}$ light years in radius if the universe is finitely bounded. This work distinguishes itself from and enhances previous works on SETI by focusing on the biological and statistical aspects of the evolution of intelligence; statistical distributions can serve as indispensable tools for SETI to model the pattern and behavior of civilizations' emergence and development, bridging the interdisciplinary gap between the astrophysical, biological, and social aspects of extraterrestrial study.
[53] vixra:1805.0310 [pdf]
The Logic of Elements of Reality
We define the logic of elements of reality. The logic of elements of reality is not a logic in the classical sense; it is an abstract language for constructing models of a certain kind. In part, it corresponds to the language of propositional logic. We define the logic of elements of reality on arbitrary sets of elements of reality. The basic relation between arbitrary elements of reality p, q is the relation p |> q (if there exists p, then there exists q). We consider physical space and the property: if p |> q then E(p) >= E(q) holds (E() is energy). For strongly deterministic spaces the law of energy conservation is described as follows: from p |> q and q |> p it follows that E(p) = E(q).
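The two constraints stated in the abstract — E(p) >= E(q) whenever p |> q, and E(p) = E(q) when p |> q and q |> p both hold — can be checked mechanically on a toy model. A minimal sketch, where the encoding of |> as a set of pairs and the particular energies are illustrative assumptions, not the paper's formalism:

```python
# Toy model of the relation p |> q with the energy constraint:
# if p |> q then E(p) >= E(q); in a strongly deterministic space,
# p |> q together with q |> p forces E(p) == E(q).

E = {"p": 3.0, "q": 2.0, "r": 2.0}               # assumed energies
entails = {("p", "q"), ("q", "r"), ("r", "q")}   # the |> relation

def energy_consistent(entails, E):
    """Check E(a) >= E(b) for every pair a |> b."""
    return all(E[a] >= E[b] for a, b in entails)

def conservation_pairs(entails):
    """Pairs with a |> b and b |> a, where E(a) == E(b) must hold."""
    return {(a, b) for a, b in entails if (b, a) in entails}

print(energy_consistent(entails, E))          # True for this model
print(sorted(conservation_pairs(entails)))    # [('q', 'r'), ('r', 'q')]
```

Here q and r entail each other, so the conservation clause pins E(q) = E(r) = 2.0, while p |> q only bounds E(p) from below by E(q).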
[54] vixra:1804.0350 [pdf]
... back to Enchantment ... ? Donald Rumsfeld : A View of the World ?
The issues considered are as follows: \begin{itemize} \item The long ongoing and by now significant disenchantment with any religion among a significant part of Western humans. \item A proposal to reinstate a general enough awareness among educated Westerners of the essential role of transcendental realms in the day to day life of humanity. The concept of the UNKNOWN UNKNOWN, or briefly (UU), introduced by Donald Rumsfeld in 2002, is suggested to be made use of. He introduced this concept into public discourse somewhat in passing, and for a far more particular issue. However, it appears that the concept of (UU) can play a basic role in building up a new and general enough awareness of the transcendental and its essential and permanent role in human affairs. The present essay starts with as brief as possible a consideration of the concept of (UU). \item Possible ``Commentaries'' on the (UU) follow, presented in subsections a) to r). This part of the essay is obviously open to further contributions. \item Ways to a possible return to ENCHANTMENT in our human view of reality - ways we have lost in our modern days - are suggested. \item The truly fundamental issue of our usual conception of ``Time'' is briefly presented, underlining the dramatic limitation which it imposes upon all the rest of our views of reality. The importance of that issue is hard to overstate, in spite of the fact that in the ever ongoing and often accelerating rush of everyday life, hardly anyone notices, let alone may be ready to stop for a while and wonder about it ... \item The essay ends with an ``Appendix'' which, together with the respective indicated literature, shows that - much contrary to the general perception - there is even today significant concern and research regarding the possible structures, far far beyond the simplicity of those in general human awareness, which may be involved in the concept of ``Time''. \end{itemize}
[55] vixra:1803.0707 [pdf]
Fine-Structure Constant from Golden Ratio Geometry
After a brief review of the golden ratio in history and our previous exposition of the fine-structure constant and equations with the exponential function, the fine-structure constant is studied in the context of other research calculating the fine-structure constant from the golden ratio geometry of the hydrogen atom. This research is extended and the fine-structure constant is then calculated in powers of the golden ratio to an accuracy consistent with the most recent publications. The mathematical constants associated with the golden ratio are also involved in both the calculation of the fine-structure constant and the proton-electron mass ratio. These constants are included in symbolic geometry of historical relevance in the science of the ancients.
[56] vixra:1803.0402 [pdf]
Letter to a Friend on Rumsfeld's Unknown Unknown ...
The essay is about how to recover the vast majority of the present day Western world from their aggressive secularism, and do so with the help of the concept of UNKNOWN UNKNOWN introduced - rather accidentally - back in 2002 by Donald Rumsfeld ...
[57] vixra:1712.0430 [pdf]
Proposed Civilization Scale
Instead of measuring a civilization's level of technological advancement from an energy-consumption point of view, I believe it is more appropriate to base it on the civilization's capability of commanding the cycle and recycling of energy and matter, or on its knowledge and technology of mastering energy and matter: Type 0 Civilization: Parasitic; Type I: Energy Mastery; Type II: Energy and Matter Mastery.
[58] vixra:1712.0134 [pdf]
Emergence of the Laws of Nature in the Developing Universe 1a
The evolution of our universe, with continuous production of matter by the vacuum, is described. The analysis is based on the quantum modification of general relativity (Qmoger), supported by the cosmic data without fitting. Various types of matter are selected by the vacuum in accordance with the stability of the developing universe. All laws of nature are emergent and approximate, including the conservation of energy. The (3+1)-dimensional space-time and gravity were selected first. Then came the quantum condensate of gravitons (dark matter). Photons and other ordinary matter were selected much later, during the formation of galaxies, when the background condensate becomes gravitationally unstable. The effect of radiation on the global dynamics is described in terms of conservation of the enthalpy density. The mass of the neutrino (as the first massive fermionic particle) is estimated, in accord with the experimental bound. The electric dipole moment of the neutrino is also estimated. The oscillations of neutrinos are explained in terms of interaction with the background condensate. The phenomena of subjective experiences are also explained in terms of the interaction of the action potentials of neurons with the background dipolar condensate, which opens a new window into the dark sector of matter. The Qmoger theory goes beyond the Standard Model and Quantum Field Theory, but can be combined with their achievements. Key words: quantum modification of general relativity, emergence of the laws of nature, isenthalpic universe, oscillating neutrinos, subjective experiences and dark sector of matter.
[59] vixra:1706.0382 [pdf]
Breaking a Multi-Layer Crypter Through Reverse-Engineering, a Case Study Into the Man1 Crypter
Crypters and packers are common in the malware world; lots of techniques have been invented over the years to help people bypass commonly used security measures. One such technique, where a crypter uses multiple, sometimes dynamically generated, layers to decode and unpack the protected executable, allows a crypter to bypass common security measures such as antivirus. While at the end of this paper we will have constructed a working proof of concept for an unpacker, it is by no means meant as a production-level mechanism; the goal is simply to show the reversing of routines found in a crypter while using a reverse-engineering framework that is geared towards shellcode analysis to our benefit for malware analysis.
[60] vixra:1706.0377 [pdf]
Dissecting the Dyre Loader
Dyre, or Dyreza, is a pretty prominent figure in the world of financial malware. The Dyre of today comes loaded with a multitude of modules and features while also appearing to be well maintained. The first recorded instance of Dyre I have found is an article from June 2014, where the sample in question is version 1001, while at the time of this report Dyre is already up to version 1166. While the crypters and packers have varied over time, for at least the past 6 months Dyre has used the same loader to perform its initial checks and injection sequence. It is the purpose of this report to go through the various techniques and algorithms present in the loader, and at times reverse them into Python proofs of concept.
[61] vixra:1706.0004 [pdf]
On the "Mysterious" Effectiveness of Mathematics in Science
This paper notes first that the effectiveness of mathematics in science appears to some writers to be "mysterious" or "unreasonable". Then reasons are given for thinking that science is, at root, the search for compression in the world. At more length, several reasons are given for believing that mathematics is, fundamentally, a set of techniques for compressing information and their application. From there, it is argued that the effectiveness of mathematics in science is because it provides a means of achieving the compression of information which lies at the heart of science. The anthropic principle provides an explanation of why we find the world - aspects of it at least - to be compressible. Information compression may be seen to be important in both science and mathematics, not only as a means of representing knowledge succinctly, but as a basis for scientific and mathematical inferences - because of the intimate relation that is known to exist between information compression and concepts of prediction and probability. The idea that mathematics may be seen to be largely about the compression of information is in keeping with the view, supported by evidence that is outlined in the paper, that much of human learning, perception, and cognition may be understood as information compression. That connection is itself in keeping with the observation that mathematics is the product of human ingenuity and an aid to human thinking.
[62] vixra:1706.0003 [pdf]
On the "Mysterious" Effectiveness of Mathematics in Science
This paper notes first that the effectiveness of mathematics in science appears to some writers to be "mysterious" or "unreasonable". Then reasons are given for thinking that science is, at root, the search for compression in the world. At more length, several reasons are given for believing that mathematics is, fundamentally, a set of techniques for information compression via the matching and unification of patterns (ICMUP), and their application. From there, it is argued that the effectiveness of mathematics in science is because it provides a means of achieving the compression of information which lies at the heart of science. The anthropic principle provides an explanation for why we find the world -- aspects of it at least -- to be compressible. ICMUP may be seen to be important in both science and mathematics, not only as a means of representing knowledge succinctly, but as a basis for scientific and mathematical inferences -- because of the intimate relation that is known to exist between information compression and concepts of prediction and probability. Since ICMUP is a key part of the "SP theory of intelligence", evidence presented in this paper strengthens the already-strong evidence for the SP theory as a unifying principle across artificial intelligence, mainstream computing, mathematics, human learning, perception, and cognition, and neuroscience. The evidence and ideas in this paper may provide the basis for a "new mathematics for science" with potential benefits and applications in science and science-related areas.
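The claim that repeated patterns make information compressible can be made concrete with a small illustration (using a general-purpose compressor as a stand-in for ICMUP, which is an assumption of this sketch, not the SP theory's own mechanism): text built from repeated patterns compresses far better than patternless bytes of the same length.

```python
import os
import zlib

# Text built from repeated patterns compresses dramatically...
patterned = b"the cat sat on the mat. " * 40   # 960 bytes of repetition
print(len(zlib.compress(patterned)))           # a few dozen bytes

# ...while patternless bytes barely compress at all.
random_ish = os.urandom(960)
print(len(zlib.compress(random_ish)))          # about 960 bytes or more
```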
[63] vixra:1705.0375 [pdf]
Holy Cosmic Condensate of Dipolar Gravitons
Quantum modification of general relativity (Qmoger) is supported by cosmic data (without fitting). The Qmoger equations consist of the Einstein equations with two additional terms responsible for production/absorption of matter. In Qmoger cosmology there was no Big Bang, and matter is continuously produced by the Vacuum. In particular, production of ultralight gravitons with a tiny electric dipole moment started about 284 billion years ago. Quantum effects dominate the interaction of these particles, and they form a quantum condensate. Under the influence of gravitation, the condensate forms galaxies and produces ordinary matter, including photons. As one important result of this activity, it recently created us, the people, and continues to support us. In particular, our subjective experiences are a result of an interaction between the background condensate and the neural system of the brain. The action potentials of the neural system create traps and coherent dynamic patterns in the dipolar condensate. So our subjective experiences are graviton-based, which can open new directions of research in biology and medicine.
[64] vixra:1702.0327 [pdf]
Exploring the Combination Rules of D Numbers From a Perspective of Conflict Redistribution
The Dempster-Shafer theory of evidence is widely applied to uncertainty modelling and knowledge reasoning because of its advantages in dealing with uncertain information. But some conditions or requirements, such as the exclusiveness hypothesis and the completeness constraint, limit the development and application of that theory to a large extent. To overcome these shortcomings and enhance its capability of representing uncertainty, a novel model, called D numbers, has been proposed recently. However, many key issues, for example how to implement the combination of D numbers, remain unsolved. In this paper, we explore the combination of D numbers from a perspective of conflict redistribution, and propose two combination rules, suitable for different situations, for the fusion of two D numbers. The proposed combination rules reduce to the classical Dempster's rule of Dempster-Shafer theory under certain conditions. Numerical examples and a discussion of the proposed rules are also given in the paper.
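For reference, the classical Dempster's rule that these combination rules reduce to can be sketched in a few lines (the dictionary-of-frozensets representation and the example masses are implementation choices for this sketch, not the paper's):

```python
from itertools import product

def dempster_combine(m1, m2):
    """Classical Dempster's rule: conjunctive combination of two mass
    functions (dicts mapping frozenset focal elements to masses),
    with renormalisation by the total conflict."""
    combined, conflict = {}, 0.0
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + wa * wb
        else:
            conflict += wa * wb  # product mass falling on the empty set
    if conflict >= 1.0:
        raise ValueError("total conflict: combination undefined")
    return {k: v / (1.0 - conflict) for k, v in combined.items()}

# Two sources over the frame {a, b}:
m1 = {frozenset("a"): 0.6, frozenset("ab"): 0.4}
m2 = {frozenset("b"): 0.5, frozenset("ab"): 0.5}
m = dempster_combine(m1, m2)
# The 0.3 of conflicting mass is discarded and the rest renormalised to 1.
```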
[65] vixra:1702.0126 [pdf]
Future of Humankind in Light of New Science
Some aspects of the future of humankind are considered based on application of the quantum modification of general relativity. Particularly, the energy supply from the vacuum and a new form of communication are discussed.
[66] vixra:1701.0334 [pdf]
Matter and Energy in a Non-Relativistic Approach Amongst the Mustard Seed and the "Faith": A Metaphysical Conclusion
The work is the result of a philosophical study of several passages of the Holy Bible with regard to faith. We analyzed verses that include mustard seed parables. The study discusses the various concepts of faith as belief and faith as a form of energy. In this concept of faith as energy, we made a connection between that energy and matter. We approach the gravitational field using the Law of Universal Gravitation and the equation of equivalence between energy and matter without relativistic effects. Of the Scriptures, we focus on Matthew 17:20, and according to the concept of faith as a form of energy, we calculate the energy needed to raise a mountain from the conversion of matter to energy in a mustard seed, comparing a massive iron mountain, Mount Everest and Mount Sinai. We conclude with these concepts and considerations that the energy of "faith" can move a mountain.
[67] vixra:1701.0013 [pdf]
Topology of P vs NP
This paper describes P vs NP using a topological approach. We model computation history as a "problem forest", and define the special problem families "Wildcard problem" and "Maximal complement Wildcard problem" to simplify relations between inputs. The problem forest is a directed graph whose edges are transition functions and whose nodes are computational configurations with the effective range of the tape. The problem forest of a DTM is a pair of tree graphs whose roots are the accepting and rejecting configurations, whose leaves are inputs, and whose trunks are computational configurations with the effective range of the tape. This tree shows the TM's interpretation of the symmetry and asymmetry of each input. From the view of the problem forest, some NTM inputs are partly merged, while all DTM inputs are totally separated. Therefore an NTM can implicitly compute some types of partial (symmetry) overlap, while a DTM has to compute them explicitly. "WILDCARD" (the Wildcard problem family) and "MAXCARD" (the Maximal complement Wildcard problem family) are special problem families that push NTM branch variations into inputs. If a "CONCRETE" (Concrete Problem) that generates MAXCARD is P-Complete, then MAXCARD is in PH, and its inputs have many overlaps. A DTM cannot compute these overlap conditions implicitly, and these conditions are necessary to compute a MAXCARD input, so a DTM has to compute them explicitly. These conditions are of over-polynomial size, and a DTM takes over-polynomially many steps to compute them explicitly. That is, PH is not P, and NP is not P.
[68] vixra:1610.0146 [pdf]
Special Relativity: Scientific or Philosophical Theory?
In this article, we argue that the theory of special relativity, as formulated by Einstein, is a philosophical rather than a scientific theory. What is scientific and experimentally supported is the formalism of the relativistic mechanics embedded in the Lorentz transformations and their direct mathematical, experimental and observational consequences. This is in parallel with quantum mechanics, where the scientific content and experimental support of this branch of physics are embedded in the formalism of quantum mechanics and not in its philosophical interpretations, such as the Copenhagen school or the parallel-worlds explanations. Einstein's theory of special relativity gets undue credit from the success of the relativistic mechanics of the Lorentz transformations. Hence, all the postulates and consequences of Einstein's interpretation which have no direct experimental or observational support should be reexamined, and the relativistic mechanics of the Lorentz transformations should be treated in education, academia and research in a similar fashion to that of quantum mechanics.
[69] vixra:1607.0195 [pdf]
Parallel Universes and Causal Anomalies: Links Between Science and Religion
In [6] it was proposed to define "god" as a region of a universe that is subject to circular causality. While we do not adopt the exact definition of "god" introduced in that paper, we do accept the concept that "god" has something to do with causal anomalies: either circular causality or else two competing causal structures. We will show that the presence of a causal anomaly (whatever it happens to be) might allow us to define the trinity in a non-contradictory way. That is, we will show how the members of the trinity can be separate entities and yet have the same identity.
[70] vixra:1607.0070 [pdf]
The Existence of Quantum Computer
We extend an empirically grounded theory of the existence of the quantum computer. The main question we consider is the possibility of the quantum computer's existence and creation. As empirical evidence we use logic, which we present in a cognitive perspective. For a definition of the computer, we use the formal definition of the Turing machine. By formulating many definitions abstractly and phenomenologically, we sidestep the areas of quantum physics, quantum computing and other quantum-related fields of science that could be expected to give an unambiguous answer to our question about the essence of the quantum computer but, as developed, cannot. In many ways this makes our theory about the existence of the quantum computer universal for these areas, although less applied to them. We consider some corollaries of the essence of the quantum computer, including the possibility of the quantum computer for the human. References to research on the cognitive nature of logic suggest the empirical basis which we follow in our theory.
[71] vixra:1601.0335 [pdf]
A Numerical Investigation Of The Significance Of Non-Dimensional Numbers On The Oscillating Flow Characteristics Of A Closed Loop Pulsating Heat Pipe
The Pulsating Heat Pipe (PHP) is a two-phase passive heat transfer device for low temperature applications. Even though it is a simple, flexible and cheap structure, its complex physics has not been fully understood and requires a robust, validated simulation tool. In the present work the basic theoretical model by H.B. Ma et al. [11] has been updated with the inclusion of capillary forces in order to characterise the pulsating flow under the influence of various non-dimensional quantities. The mathematical model is solved using an explicit embedded Runge-Kutta method, and it is shown that the Poiseuille number considered in the numerical analysis assumes more significance since it includes the flow characteristics, geometry and fluid properties of a PHP.
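The Poiseuille number mentioned above is simply the product of the friction factor and the Reynolds number; a minimal sketch (assuming the Darcy friction factor convention, which is a choice of this illustration, not necessarily the paper's) shows why it collapses to a constant in fully developed laminar pipe flow:

```python
def poiseuille_number(darcy_friction_factor: float, reynolds: float) -> float:
    """Po = f * Re, a non-dimensional group combining flow
    characteristics, geometry and fluid properties."""
    return darcy_friction_factor * reynolds

# For fully developed laminar pipe flow, f = 64/Re (Darcy), so Po is constant:
for Re in (100.0, 500.0, 2000.0):
    f = 64.0 / Re
    print(poiseuille_number(f, Re))  # 64.0 every time
```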
[72] vixra:1601.0334 [pdf]
Analytical Investigation and Numerical Prediction of Driving Point Mechanical Impedance for Driver Posture By Using ANN
In vibration, the human body is a unified and complex active dynamic system. Lumped parameters are often used to capture and evaluate the human dynamic properties. Whole-body vibration causes a multifaceted distribution of vibration within the body, and the disagreeable feelings giving rise to discomfort or exasperation result in impaired performance and health. This distribution of vibration depends on intra-subject and inter-subject variability. For this study a multi-degree-of-freedom lumped parameter model has been taken for analysis. The equations of motion are derived, and the response functions such as the seat-to-head transmissibility (STHT), the driving point mechanical impedance (DPMI) and the apparent mass (APMS) are determined. For this kind of study we can use an artificial neural network (ANN), a powerful data modeling tool that is able to capture and represent complex input/output relationships. The goal of the ANN is to create a model that correctly maps the input to the output using historic data, so that the model can then be used to produce the output when the desired output is unknown.
[73] vixra:1601.0333 [pdf]
Predicting and Analyzing the Efficiency of Portable Scheffer Reflector By Using Response Surface Method
The Portable Scheffer Reflector (PSR) is an important and useful mechanical device that uses solar energy for numerous applications. The present work considers a 2.7 square meter Scheffer reflector used for domestic applications in the Indian context, such as water heating. The independent parameters considered are the position of the PSR surface with respect to the sun, i.e. the tilting angle (AR), the processing time (TM), measured on a 24-hour clock, and the water quantity (WT). The parameter related to the PSR performance is the efficiency of the PSR (EFF). Response surface methodology (RSM) was used to predict and analyze the performance of the PSR. The experiments were conducted based on a three-factor, three-level, central composite face-centered design with the full replications technique, and a mathematical model was developed. Sensitivity analysis was carried out to identify critical parameters. The results obtained through response surface methodology were compared with the actually observed performance parameters. The results show that RSM is an easy and effective tool for modelling and analyzing the performance of any mechanical system.
[74] vixra:1511.0009 [pdf]
A Study on the Coffee Spilling Phenomena in the Low Impulse Regime
When a glass of wine is oscillated horizontally at 4Hz, the liquid surface oscillates calmly. But when the same amount of liquid is contained in a cylindrical mug and oscillated under the same conditions, the liquid starts to oscillate aggressively against the container walls and results in significant spillage. This is a manifestation of the same principles that also cause coffee spillage when we walk. In this study, we experimentally investigate the cup motion and liquid oscillation during locomotion. The frequency spectrum of each motion reveals that the second harmonic mode of the hand motion corresponds to the resonance frequency of the first antisymmetric mode of coffee oscillation, resulting in maximum spillage. By applying these experimental findings, a number of methods to suppress resonance are presented. Then, we construct two mechanical models to rationalize our experimental findings and gain further insight; both models successfully predict actual hand behaviors.
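The resonance behind the spillage can be estimated from standard linear sloshing theory (a textbook result, not the study's own model): the first antisymmetric mode of liquid in an upright cylinder has a natural frequency set by the container radius and liquid depth. The mug dimensions below are assumed for illustration.

```python
import math

def sloshing_frequency(radius: float, depth: float, g: float = 9.81) -> float:
    """Natural frequency (Hz) of the first antisymmetric sloshing mode
    of liquid in an upright circular cylinder (linear potential-flow
    theory); 1.8412 is the first zero of the Bessel derivative J1'."""
    k = 1.8412 / radius
    omega = math.sqrt(g * k * math.tanh(k * depth))
    return omega / (2.0 * math.pi)

# A mug of radius 4 cm holding 10 cm of coffee resonates near walking pace:
print(round(sloshing_frequency(0.04, 0.10), 2))  # about 3.4 Hz
```

This is why the second harmonic of a roughly 2 Hz walking gait lands on the sloshing resonance of a typical mug.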
[75] vixra:1510.0114 [pdf]
Gravity of Subjectivity
This work is based on the quantum modification of general relativity, which includes effects of production/absorption of gravitons by the vacuum. It turns out that gravitons created, and continue to influence, the universe, including people. The theory (without fitting parameters) is in good quantitative agreement with cosmological observations. In this theory we obtain an interface between gravitons and ordinary matter, which very likely exists not only in the cosmos, but everywhere, including our body and, especially, our brain. Subjective experiences are considered as a manifestation of that interface. This opens the possibility of a "communication" with gravitons. Probable applications of these ideas include health (brain stimulation), communication, computational capabilities and energy resources. The social consequences of these ideas could be comparable with the effects of the invention and application of electricity.
[76] vixra:1509.0209 [pdf]
On the Millennium Prize Problems
There is a Prize committee (claymath.org) which requires publication in a worldwide reputable mathematics journal and at least two years of following scientific admiration. Why then has Grisha Perelman published only on a forum (arXiv), in a publication as unclear as a crazy sketch, yet been forced to accept the Millennium Prize? Am I simply ugly or poor? If the following text is not accepted by the committee as pay-able proofs (but I hope it will be), then let it at least build your confidence to refer to these conjectures and problems (which now have my answers) as achieved facts. I see no logical problems with all these plain facts; are you with me at last? It is your free choice to be a blind and discriminative ignorant, or to be better. One can even ignore one's own breathing and, thus, die. One can ignore whatever one likes in this world. But it is not always recommended. Please respect my copyrights!
[77] vixra:1507.0152 [pdf]
In God We Mind or Physical Considerations of Divine
It is possible to buy this paper; your money will not be spent on entertainment, and you will be rewarded in Heaven for spending your time and money to consume and promote (among your contacts and friends) the product of the Cripple author. Points for God are called not proofs but "arguments", because they are illustrations of the divine. As an example: God exists because the word "God" means "exists"; He has more right to exist than anyone else. Therefore the criticism against the arguments (main modern arguers: S. Hawking, R. Dawkins) is pointless. Dr. Marcelo Gleiser, in his article "Hawking And God: An Intimate Relationship", wrote: "Maybe Hawking should leave God alone." The Universe could have been any, but it is the most complex in the face of humans. The probability of such a "random" event is zero. For sure, without God the complexity would be average, not the top one.
[78] vixra:1504.0207 [pdf]
Quantum Games of Opinion Formation Based on the Marinatto-Weber Quantum Game Scheme
Quantization has become a new way to study classical game theory since quantum strategies and quantum games were proposed. In previous studies, many typical game models, such as the prisoner's dilemma, the battle of the sexes, and the Hawk-Dove game, have been investigated using quantization approaches. In this paper, several game models of opinion formation are quantized based on the Marinatto-Weber quantum game scheme, a frequently used scheme for converting classical games to quantum versions. Our results show that quantization can fascinatingly change the properties of some classical opinion formation game models so as to generate win-win outcomes.
[79] vixra:1504.0157 [pdf]
A Quantum Extension to the Inspection Game
Quantum game theory is a new interdisciplinary field between game theory and physical research. In this paper, we extend the classical inspection game into a quantum game version by quantizing the strategy space and introducing entanglement between players. The quantum inspection game has various Nash equilibria depending on the initial quantum state of the game. Our results also show that quantization can help each player individually to increase his own payoff, but cannot simultaneously improve the collective payoff in the quantum inspection game.
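The classical baseline being extended here is a 2x2 bimatrix game whose only Nash equilibrium is in mixed strategies. A minimal sketch with assumed payoffs (the numbers below are illustrative, not the paper's) finds that equilibrium by the usual indifference conditions:

```python
def mixed_equilibrium_2x2(A, B):
    """Interior mixed-strategy Nash equilibrium of a 2x2 bimatrix game.
    A[i][j] / B[i][j]: row / column player's payoff for actions (i, j).
    Returns (p, q): probabilities of row action 0 and column action 0,
    chosen so that each player is indifferent between his two actions."""
    q = (A[1][1] - A[0][1]) / (A[0][0] - A[0][1] - A[1][0] + A[1][1])
    p = (B[1][1] - B[1][0]) / (B[0][0] - B[1][0] - B[0][1] + B[1][1])
    return p, q

# Illustrative inspection-game payoffs (assumed, not the paper's):
# rows = inspectee {comply, cheat}, columns = inspector {inspect, trust}.
A = [[0, 0], [-10, 5]]   # caught cheating costs 10; unseen cheating gains 5
B = [[-1, 0], [4, -5]]   # inspecting costs 1; a catch pays 4; a missed cheat costs 5
p, q = mixed_equilibrium_2x2(A, B)  # comply with p = 0.9, inspect with q = 1/3
```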
[80] vixra:1502.0236 [pdf]
Impact of Preference and Equivocators on Opinion Dynamics with Evolutionary Game Framework
Opinion dynamics, which aims to understand the evolution of collective behavior through various interaction mechanisms of opinions, represents one of the greatest challenges in natural and social science. To elucidate this issue, the binary opinion model becomes a useful framework, where each agent holds one of two opinions. Inspired by realistic observations, here we propose two basic interaction mechanisms for the binary opinion model: one is the so-called BSO model, in which players benefit from holding the same opinion; the other is the BDO model, in which players benefit from taking different opinions. In terms of these two basic models, the combined effect of opinion preference and equivocators on the evolution of binary opinion is studied under the framework of evolutionary game theory (EGT), where the replicator equation (RE) is employed to mimic the evolution of opinions. By means of numerous simulations, we show the theoretical equilibrium states of binary opinion dynamics, and mathematically analyze the stability of each equilibrium state as well.
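The replicator equation mentioned above can be sketched for a BSO-style setting (the linear payoffs below are an assumption of this sketch, not the paper's exact model): the fraction x holding opinion A grows when A's payoff exceeds B's, which here happens whenever A is already the majority.

```python
def evolve_bso(x0: float, a: float = 1.0, dt: float = 0.01, steps: int = 5000) -> float:
    """Euler-integrated replicator dynamics for a BSO-style binary
    opinion model (assumed payoffs: holding an opinion pays `a` times
    the fraction of agents currently sharing it). x is the fraction
    holding opinion A."""
    x = x0
    for _ in range(steps):
        fA = a * x            # payoff to opinion A
        fB = a * (1.0 - x)    # payoff to opinion B
        x += dt * x * (1.0 - x) * (fA - fB)  # replicator equation
    return x

# Coordination payoffs make the initial majority opinion take over:
print(round(evolve_bso(0.6), 3))  # drifts toward consensus at 1.0
print(round(evolve_bso(0.4), 3))  # drifts toward consensus at 0.0
```

The interior fixed point x = 1/2 is unstable, while the consensus states x = 0 and x = 1 are stable, which is the bistability one expects from a coordination mechanism.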
[81] vixra:1502.0192 [pdf]
Physical Dimension of Sciences
I propose a classification of scientific fields by the place that their typical objects occupy in a three-dimensional space of physical dimensions: length, mass and time, on a logarithmic scale. The classification includes some areas of physics, chemistry, biology and geology, as well as history. Natural interdisciplinary connections are established, as well as the gaps: the regions of the space in which there are no objects of modern science.
[82] vixra:1502.0160 [pdf]
History of Problem, Cooperstock is Wrong.
Dear readers, the picture of physics leaves you in confusion. The prime example is the refutation of black holes in 2014, Phys. Lett. B 738, 61–7, by Laura, a Professor. I have arguments against her paper, but perhaps I am the only one who is worried. They keep bringing forward things which are thought to be refuted and over-refuted. Another example of mind blowing is Dr. Cooperstock. First his attempt was to deny the Standards of Metrology (within the "Energy Localization hypothesis"). I have arguments against his idea. Then he came up with another mind abuse: the absence of the long-detected Dark Matter. In the following I am defending the Dark Matter from the nihilistic aggression of Dr. Cooperstock. Speaking of nihilism, the most grim picture is in the Quantum Mechanics of Niels Bohr. In 2015 they have "proved" in the elitist "Nature" that Schrödinger's Cat is real. Thus, the world does not exist: a thing can not both be and not be. It is very convenient now: if even a grain of sand is a crazy hallucination (like the "proven" "reality" of the undead cat), then this non-existent grain needs no divine (loved, but more often hated) Creator. The reason for the delusion: they have missed intelligent factors, e.g. evil spirits, which very often act on the measuring device. Recall the wrong alarms in atomic armies.
[83] vixra:1412.0088 [pdf]
Is Entropy Enough to Evaluate the Probability Transformation Approach of Belief Function?
In Dempster-Shafer Theory (DST) of evidence and the transferable belief model (TBM), the probability transformation is necessary and crucial for decision-making. The evaluation of the quality of a probability transformation is usually based on entropy or probabilistic information content (PIC) measures, which are questioned in this paper. An alternative probability transformation approach based on uncertainty minimization is proposed to verify the rationality of entropy or PIC as evaluation criteria for the probability transformation. Based on experimental comparisons among different probability transformation approaches, the rationality of using entropy or PIC measures to evaluate probability transformation approaches is analyzed and discussed.
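The baseline that such transformations are usually compared against is Smets' pignistic transformation BetP, which splits each focal element's mass evenly over the singletons it contains; a minimal sketch (the example bba is assumed):

```python
def pignistic_transform(bba):
    """Smets' pignistic transformation BetP: each focal element's mass
    is divided equally among its singletons, yielding a probability
    distribution over the frame."""
    betp = {}
    for focal, mass in bba.items():
        share = mass / len(focal)
        for element in focal:
            betp[element] = betp.get(element, 0.0) + share
    return betp

# Example bba over the frame {a, b, c}; the mass on 'abc' models ignorance.
bba = {frozenset("a"): 0.5, frozenset("bc"): 0.3, frozenset("abc"): 0.2}
print(pignistic_transform(bba))
# BetP(a) = 0.5 + 0.2/3; BetP(b) = BetP(c) = 0.15 + 0.2/3; the total is 1.
```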
[84] vixra:1412.0084 [pdf]
Application of Referee Functions to the Vehicle-Borne Improvised Explosive Device Problem
We propose a solution to the Vehicle-Borne Improvised Explosive Device problem. This solution is based on modelling by belief functions, and involves the construction of a combination rule dedicated to this problem. The construction of the combination rule is made possible by a tool developed in previous works: a generic framework dedicated to the construction of combination rules. This tool implies a tripartite architecture, with respective parts implementing the logical framework, the combination definition (referee function) and the computation processes. Referee functions provide decisional arbitration conditional on the basic decisions provided by the sources of information, and allow rule definitions at the logical level adapted to the application. We construct a referee function for the Vehicle-Borne Improvised Explosive Device problem, and compare it to reference combination rules.
[85] vixra:1412.0083 [pdf]
Change Detection from Remote Sensing Images Based on Evidential Reasoning
Theories of evidence have already been applied more or less successfully to the fusion of remote sensing images. In classical evidential reasoning, all the sources of evidence and their fusion results are related to the same invariable (static) frame of discernment. Nevertheless, changes may occur through multi-temporal remote sensing images, and these changes need to be detected efficiently in some applications. The invariable frame of classical evidential reasoning cannot efficiently represent or detect change occurrences in heterogeneous remote sensing images. To overcome this limitation, Dynamical Evidential Reasoning (DER) is proposed for the sequential fusion of multi-temporal images. A new state transition frame is defined in DER, and change occurrences can be precisely represented by introducing a state transition operator. The belief functions used in DER are defined similarly to those of Dempster-Shafer Theory (DST). Two kinds of dynamical combination rules, working in the free model and the constrained model, are proposed in this new framework to deal with the different cases. Finally, an experiment using three real satellite images acquired before and after an earthquake is provided to show the interest of the new approach.
[86] vixra:1412.0081 [pdf]
Edge Detection in Color Images Based on DSmT
In this paper, we present a non-supervised methodology for edge detection in color images based on belief functions and their combination. Our algorithm is based on the fusion of the results of local edge detectors, expressed as basic belief assignments thanks to a flexible modeling, and on the proportional conflict redistribution rule developed in the DSmT framework. The application of this new belief-based edge detector is tested both on the original (noise-free) Lena picture and on a modified image including artificial pixel noise, to show the ability of our algorithm to work on noisy images too.
[87] vixra:1412.0075 [pdf]
A Fuzzy-Cautious OWA Approach with Evidential Reasoning
Multi-criteria decision making (MCDM) means making decisions in the presence of multiple criteria. To make a decision in the framework of MCDM under uncertainty, a novel Fuzzy-Cautious OWA with evidential reasoning (FCOWA-ER) approach is proposed in this paper. The payoff matrix and the belief functions of the states of nature are used to generate the expected payoffs, from which two Fuzzy Membership Functions (FMFs), representing an optimistic and a pessimistic attitude respectively, can be obtained. Two basic belief assignments (bba's) are then generated from the two FMFs. By evidence combination, a combined bba is obtained, which can be used to make the decision. There is no problem of weight selection in FCOWA-ER as in traditional OWA. When compared with other evidential-reasoning-based OWA approaches such as COWA-ER, FCOWA-ER has lower computational cost and clearer physical meaning. Some experiments and related analyses are provided to justify the proposed FCOWA-ER.
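The OWA operator underlying these approaches weights values by their rank rather than by their source, so the weight vector encodes the decision attitude from optimistic to pessimistic; a minimal sketch (the payoff values are assumed):

```python
def owa(values, weights):
    """Ordered Weighted Averaging: the weights apply to the values
    sorted in descending order, not to any particular source."""
    ordered = sorted(values, reverse=True)
    return sum(w * v for w, v in zip(weights, ordered))

payoffs = [0.2, 0.9, 0.5]
print(owa(payoffs, [1.0, 0.0, 0.0]))   # optimistic attitude: 0.9 (the max)
print(owa(payoffs, [0.0, 0.0, 1.0]))   # pessimistic attitude: 0.2 (the min)
print(owa(payoffs, [1/3, 1/3, 1/3]))   # neutral attitude: the plain mean
```

The weight-selection problem that FCOWA-ER avoids is precisely the question of where between these extremes the weight vector should sit.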
[88] vixra:1412.0074 [pdf]
Hierarchical DSmP Transformation for Decision-Making Under Uncertainty
Dempster-Shafer evidence theory is widely used for approximate reasoning under uncertainty; however, decision-making is more intuitive and easier to justify when made in the probabilistic context. Thus the transformation approximating a belief function by a probability measure is crucial and important for decision-making in the evidence theory framework. In this paper we present a new transformation of any general basic belief assignment (bba) into a Bayesian belief assignment (or subjective probability measure) based on a new proportional and hierarchical principle of uncertainty reduction. Some examples are provided to show the rationality and efficiency of our proposed probability transformation approach.
[89] vixra:1412.0069 [pdf]
On The Validity of Dempster-Shafer Theory
We challenge the validity of Dempster-Shafer Theory by using an emblematic example to show that the DS rule produces a counter-intuitive result. Further analysis reveals that the result comes from an understanding of evidence pooling which goes against the common expectation of this process. Although DS theory has attracted some interest from the scientific community working in information fusion and artificial intelligence, its validity for solving practical problems is problematic, because it is not applicable to evidence combination in general, but only to certain types of situations, which still need to be clearly identified.
[90] vixra:1412.0054 [pdf]
Characterization of Hard and Soft Sources of Information: a Practical Illustration
Physical sensors (hard sources) and humans (soft sources) have complementary features in terms of perception, reasoning and memory. It is thus natural to combine their associated information for a wider coverage of the diversity of the available information, and thus provide enhanced situation awareness for the decision maker. While the fusion domain mainly (although not only) considers the processing and combination of information from hard sources, conciliating these two broad areas is gaining more and more interest in the domain of hard and soft fusion. In order to better understand the diversity and specificity of sources of information, we propose a functional model of a source of information, and a structured list of dimensions along which a source of information can be qualified. We illustrate some properties on real data gathered from an experiment on light detection in a fog chamber involving both automatic and human detectors.
[91] vixra:1411.0509 [pdf]
Algorithm of Nature
Numerical simulations of elementary gravitational and electromagnetic fields are done with an amazingly simple algorithm. Yet this algorithm renders nature correctly. Moreover, the material world is revealed to be completely Riemannian-geometrical, without exception. The mathematics is based on the geometric theory of fields, which refers to Einstein and Rainich. The correctness of the theory is manifested in the fact that known particles appear as discrete solutions of the geometric field equations. The results involve a new understanding of mathematical principles.
[92] vixra:1411.0499 [pdf]
Interval-Valued Neutrosophic Soft Sets and Its Decision Making
In this paper, the notion of interval-valued neutrosophic soft sets (ivn-soft sets) is defined as a combination of interval-valued neutrosophic sets [36] and soft sets [30]. Our ivn-soft sets generalize the concepts of the soft set, fuzzy soft set, interval-valued fuzzy soft set, intuitionistic fuzzy soft set, interval-valued intuitionistic fuzzy soft set and neutrosophic soft set. Then, we introduce some definitions and operations on ivn-soft sets. Some properties of ivn-soft sets connected to these operations are established. A further aim of this paper is to investigate decision making based on ivn-soft sets by level soft sets. Therefore, we develop a decision making method and then give an example to illustrate the developed approach.
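One representative operation on interval-valued neutrosophic values can be sketched as follows (the max/min convention below is a commonly used definition of the union, assumed here rather than taken from the paper, whose definitions on ivn-soft sets may differ):

```python
def ivn_union(a, b):
    """Union of two interval-valued neutrosophic values, each written as
    ((T_lo, T_hi), (I_lo, I_hi), (F_lo, F_hi)). A common convention
    takes the max on truth and the min on indeterminacy and falsity."""
    (ta, ia, fa), (tb, ib, fb) = a, b
    return (
        (max(ta[0], tb[0]), max(ta[1], tb[1])),  # truth interval
        (min(ia[0], ib[0]), min(ia[1], ib[1])),  # indeterminacy interval
        (min(fa[0], fb[0]), min(fa[1], fb[1])),  # falsity interval
    )

x = ((0.2, 0.4), (0.3, 0.5), (0.3, 0.6))
y = ((0.5, 0.7), (0.1, 0.2), (0.2, 0.4))
print(ivn_union(x, y))  # ((0.5, 0.7), (0.1, 0.2), (0.2, 0.4))
```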
[93] vixra:1411.0491 [pdf]
Neutrosophic Ideals of Γ-Semirings
Neutrosophic ideals of a Γ-semiring are introduced and studied in the sense of Smarandache [14], along with some operations on them, such as intersection, composition and cartesian product. Among the other results/characterizations, it is shown that all the operations are structure preserving.
[94] vixra:1411.0488 [pdf]
Neutrosophic Soft Relations and Some Properties
In this work, we first define a relation on neutrosophic soft sets which allows the composition of two neutrosophic soft sets. It is devised to derive useful information through the composition of two neutrosophic soft sets. Then, we examine symmetric, transitive and reflexive neutrosophic soft relations, and many related concepts, such as the equivalence neutrosophic soft set relation, partitions of neutrosophic soft sets, equivalence classes, quotient neutrosophic soft sets and neutrosophic soft composition, are given and their propositions discussed. Finally, a decision making method on neutrosophic soft sets is presented.
[95] vixra:1411.0485 [pdf]
Possibility Neutrosophic Soft Sets with Applications in Decision Making and Similarity Measure
In this paper, the concept of the possibility neutrosophic soft set and its operations are defined and their properties are studied. An application of this theory to decision making is investigated. A similarity measure of two possibility neutrosophic soft sets is also introduced and discussed. Finally, an application of this similarity measure to personnel selection for a firm is presented.
[96] vixra:1411.0462 [pdf]
Neutrosophic Soft Semirings
The purpose of this paper is to study semirings and their ideals by means of neutrosophic soft sets. After noting some preliminary ideas for subsequent use in Sections 1 and 2, I introduce and study the neutrosophic soft semiring, neutrosophic soft ideals, the idealistic neutrosophic soft semiring and the regular (intra-regular) neutrosophic soft semiring, along with some of their characterizations, in Sections 3 and 4. In Section 5, I illustrate all the necessary definitions and results with examples.
[97] vixra:1411.0461 [pdf]
Neutrosophic Soft Sets and Neutrosophic Soft Matrices Based on Decision Making
Maji [32] first proposed that neutrosophic soft sets can handle the indeterminate and inconsistent information which commonly exists in belief systems. In this paper, we first redefine the complement and union, and compare our definitions of neutrosophic soft sets with the definitions given by Maji. We then introduce the concept of the neutrosophic soft matrix and its operators, which are more functional for theoretical studies in neutrosophic soft set theory.
[98] vixra:1411.0460 [pdf]
Neutrosophic Soft Sets with Applications in Decision Making
We first present the definitions and properties given in the study of Maji [11] on neutrosophic soft sets, and then give a few notes on his study. Next, based on Çağman [4], we redefine the notion of the neutrosophic soft set and neutrosophic soft set operations to make them more functional. Using these new definitions we construct a decision making method and a group decision making method which select a set of optimum elements from the alternatives. We finally present examples which show that the methods can be successfully applied to many problems that contain uncertainties.
[99] vixra:1411.0454 [pdf]
Similarity Measure Between Possibility Neutrosophic Soft Sets and Its Applications
In this paper, a similarity measure between possibility neutrosophic soft sets (PNS-sets) is defined, and its properties are studied. A decision making method is established based on the proposed similarity measure. Finally, an application of this similarity measure to a real life problem is given.
[100] vixra:1411.0449 [pdf]
Subsethood Measure for Single Valued Neutrosophic Sets
The main aim of this paper is to introduce a neutrosophic subsethood measure for single valued neutrosophic sets. For this purpose, we first introduce a system of axioms for subsethood measures of single valued neutrosophic sets. We then give a simple subsethood measure based on a distance measure. Finally, to show the effectiveness of the intended subsethood measure, an application to a multicriteria decision making problem is presented and the results obtained are discussed. Though simple to calculate, the subsethood measure presents a new approach to dealing with neutrosophic information.
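The abstract does not reproduce the paper's axioms or formulas, but a minimal sketch of what a distance-based subsethood measure for single valued neutrosophic sets can look like is given below. The element-wise intersection, the normalized Hamming distance, and the formula subsethood(A, B) = 1 - d(A, A ∩ B) are common textbook choices assumed here for illustration, not the authors' exact definitions.

```python
# Illustrative sketch of a distance-based subsethood measure for single
# valued neutrosophic sets (SVNSs). Each element of an SVNS carries a
# truth (T), indeterminacy (I) and falsity (F) degree in [0, 1]. The
# intersection and distance used below are common textbook choices, not
# necessarily the paper's definitions.

def svns_intersection(A, B):
    """Element-wise intersection: min on T, max on I and F."""
    return [(min(ta, tb), max(ia, ib), max(fa, fb))
            for (ta, ia, fa), (tb, ib, fb) in zip(A, B)]

def svns_distance(A, B):
    """Normalized Hamming distance between two SVNSs."""
    return sum(abs(ta - tb) + abs(ia - ib) + abs(fa - fb)
               for (ta, ia, fa), (tb, ib, fb) in zip(A, B)) / (3 * len(A))

def subsethood(A, B):
    """Degree to which A is contained in B: 1 exactly when A == A ∩ B."""
    return 1.0 - svns_distance(A, svns_intersection(A, B))

A = [(0.3, 0.2, 0.6), (0.5, 0.1, 0.4)]
B = [(0.7, 0.1, 0.2), (0.9, 0.0, 0.1)]
print(subsethood(A, B))  # 1.0 here: A is contained in B element-wise
print(subsethood(B, A))  # strictly below 1: B is not contained in A
```

By construction the measure equals 1 exactly when A coincides with A ∩ B, i.e. when A is contained in B element-wise, which is one of the axioms such measures are usually required to satisfy.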
[101] vixra:1411.0444 [pdf]
Florentin Smarandache: A Celebration
We celebrate Prof. Florentin Smarandache, the Associate Editor and co-founder of Progress in Physics, who is a prominent mathematician of the 20th/21st centuries. Prof. Smarandache is best known as the founder of neutrosophic logic, a modern extension of fuzzy logics that introduces neutralities and denials (such as “neutral A” and “non-A” between “A” and “anti-A”). He is also known for his many discoveries in the field of pure mathematics, such as number theory, set theory and functions (see the many items connected with his name in the CRC Encyclopedia of Mathematics). A multi-talented person, Prof. Smarandache is also known for his achievements in other fields of science, and as a poet and writer. He still works in science and continues his creative research activity.
[102] vixra:1411.0443 [pdf]
Generalized Exponential Type Estimator for Population Variance in Survey Sampling
In this paper, a generalized exponential-type estimator is proposed for estimating the population variance using a mean auxiliary variable in single-phase sampling. Some special cases of the proposed generalized estimator are also discussed.
[103] vixra:1411.0432 [pdf]
Advances in DS Evidence Theory and Related Discussions
Based on a review of the development of and recent advances in models, reasoning, decision making and evaluation in evidence theory, this paper provides analyses and discussions of some problems, confusions and misunderstandings in evidence theory, together with related numerical examples. The relations between evidence theory and probability theory, evidence conflict and the related counter-intuitive results, some definitions of the distance of evidence, and the evaluation criteria in evidence theory are covered. Future developing trends of evidence theory are also analyzed. This paper aims to provide a reference for the correct understanding and use of evidence theory.
[104] vixra:1411.0431 [pdf]
An Airplane Image Target's Multi-feature Fusion Recognition Method
This paper proposes an image target's multi-feature fusion recognition method based on probabilistic neural networks (PNN) and Dezert-Smarandache theory (DSmT). The information from multiple features extracted from an image is fused. Firstly, the image is preprocessed with binarization, and then multiple features are extracted, such as Hu moments, the normalized moment of inertia, affine invariant moments, discrete outline parameters and singular values.
[105] vixra:1411.0428 [pdf]
Combining Sources of Evidence with Reliability and Importance for Decision Making
The combination of sources of evidence with reliability has been widely studied within the framework of Dempster-Shafer theory (DST), which has been employed as a major method for integrating multiple sources of evidence with uncertainty. Since sources of evidence may also differ in importance, for example in multi-attribute decision making (MADM), we propose an importance discounting and combination method within the framework of DST to combine sources of evidence with importance, composed of an importance discounting operation and an extended Dempster's rule of combination.
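For orientation, the classical building blocks that such methods extend can be sketched as follows: Shafer's reliability discounting followed by Dempster's rule of combination. The importance discounting operation and the extended combination rule proposed in the paper are not reproduced here; the frame of discernment and the mass values below are purely illustrative.

```python
# Sketch of classical Shafer reliability discounting followed by
# Dempster's rule of combination. Masses are dicts mapping frozensets
# (focal elements) to belief mass; THETA is the frame of discernment.

THETA = frozenset({"a", "b", "c"})

def discount(m, alpha):
    """Shafer's discounting: scale masses by alpha, move the rest to THETA."""
    out = {A: alpha * v for A, v in m.items() if A != THETA}
    out[THETA] = 1.0 - alpha + alpha * m.get(THETA, 0.0)
    return out

def dempster(m1, m2):
    """Dempster's rule: conjunctive combination with conflict renormalization."""
    combined, conflict = {}, 0.0
    for A, v1 in m1.items():
        for B, v2 in m2.items():
            inter = A & B
            if inter:
                combined[inter] = combined.get(inter, 0.0) + v1 * v2
            else:
                conflict += v1 * v2  # mass assigned to the empty set
    if conflict >= 1.0:
        raise ValueError("total conflict: sources cannot be combined")
    return {A: v / (1.0 - conflict) for A, v in combined.items()}

m1 = {frozenset({"a"}): 0.6, THETA: 0.4}
m2 = {frozenset({"b"}): 0.7, THETA: 0.3}
fused = dempster(discount(m1, 0.9), discount(m2, 0.9))
print(fused)  # masses on {a}, {b} and THETA, summing to 1
```

Discounting a source with reliability factor alpha < 1 shifts part of its mass to the whole frame THETA, so a less reliable source commits less strongly before combination.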
[106] vixra:1411.0425 [pdf]
D Numbers Theory: a Generalization of Dempster-Shafer Theory
Dempster-Shafer theory is widely applied to uncertainty modelling and knowledge reasoning because of its ability to express uncertain information. However, some conditions, such as the exclusiveness hypothesis and the completeness constraint, limit its development and application to a large extent. To overcome these shortcomings of Dempster-Shafer theory and to enhance its capability of representing uncertain information, a novel theory called D numbers theory is systematically proposed in this paper.
[107] vixra:1411.0422 [pdf]
Generalized Evidence Theory
Conflict management is still an open issue in the application of Dempster-Shafer evidence theory, and many works have been presented to address it. In this paper, a new theory, called generalized evidence theory (GET), is proposed. Compared with existing methods, GET assumes that the general situation is an open world, owing to uncertainty and incomplete knowledge.
[108] vixra:1411.0415 [pdf]
Performance of M-ary Soft Fusion Systems Using Simulated Human Responses
A major hurdle in the development of soft and hard/soft data fusion systems is the inability to determine the practical performance gains between fusion operators without the burdens associated with human testing. Drift diffusion models of human responses (i.e., decisions, confidence assessments and response times) from cognitive psychology can be used to gain a sense of the performance of a fusion system during the design phase without the need for human testing.
[109] vixra:1411.0414 [pdf]
Performance of Probability Transformations Using Simulated Human Opinions
Probability transformations provide a method of relating Dempster-Shafer sources of evidence to subjective probability assignments. These transforms are constructed to facilitate decision making over a set of mutually exclusive hypotheses.
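The best-known such transform is the pignistic transformation BetP, which splits each focal element's mass evenly among its singletons; a minimal sketch (assuming no mass on the empty set) is given below. Whether BetP is among the transforms evaluated in the paper is not stated in this abstract.

```python
# Sketch of the pignistic transformation BetP from Dempster-Shafer
# theory: BetP(x) = sum over focal sets A containing x of m(A) / |A|.
# Assumes a normalized mass function with no mass on the empty set.

def pignistic(m):
    """Spread each focal element's mass evenly over its singletons."""
    betp = {}
    for A, mass in m.items():
        share = mass / len(A)
        for x in A:
            betp[x] = betp.get(x, 0.0) + share
    return betp

m = {frozenset({"a"}): 0.5,
     frozenset({"a", "b"}): 0.3,
     frozenset({"a", "b", "c"}): 0.2}
betp = pignistic(m)
for x in sorted(betp):
    print(x, round(betp[x], 4))
# a 0.7167
# b 0.2167
# c 0.0667
```

The result is an ordinary probability distribution over the singletons, which is what makes decision making over mutually exclusive hypotheses possible.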
[110] vixra:1411.0358 [pdf]
A Note on Computing Lower and Upper Bounds of Subjective Probability from Masses of Belief
This short note shows, on a very simple example, the consistency of the free DSm model encountered in Dezert-Smarandache Theory (DSmT) [2] with a refined model for computing lower and upper probability bounds from basic belief assignments (bba). Belief functions were introduced in 1976 by Shafer in Dempster-Shafer Theory (DST); see [1] and [2] for definitions and examples.
[111] vixra:1411.0335 [pdf]
Neutrosophic Crisp Sets & Neutrosophic Crisp Topological Spaces
In this paper, we generalize the crisp topological space to the notion of the neutrosophic crisp topological space, and we construct the basic concepts of the neutrosophic crisp topology. In addition, we introduce the definitions of neutrosophic crisp continuous functions and neutrosophic crisp compact spaces. Finally, some characterizations concerning neutrosophic crisp compact spaces are presented and several of their properties are obtained. A possible application to GIS topology rules is touched upon.
[112] vixra:1411.0332 [pdf]
Neutrosophic Multi Relations and Their Properties
In this paper, the neutrosophic multi relation (NMR), defined on neutrosophic multisets [18], is introduced. Various properties such as reflexivity, symmetry and transitivity are studied.
[113] vixra:1411.0329 [pdf]
Neutrosophic Refined Relations and Their Properties
In this paper, the neutrosophic refined relation (NRR), defined on neutrosophic refined sets (multisets) [13], is introduced. Various properties such as reflexivity, symmetry and transitivity are studied.
[114] vixra:1411.0319 [pdf]
Rough Neutrosophic Sets
Both neutrosophic set theory and rough set theory are emerging as powerful tools for managing uncertain, indeterminate, incomplete and imprecise information. In this paper we develop a hybrid structure called rough neutrosophic sets and study their properties.
[115] vixra:1411.0316 [pdf]
Soft Neutrosophic Loop, Soft Neutrosophic Biloop and Soft Neutrosophic N-Loop
Soft set theory is a general mathematical tool for dealing with uncertain, fuzzy, not clearly defined objects. In this paper we introduce the soft neutrosophic loop, the soft neutrosophic biloop and the soft neutrosophic N-loop, with a discussion of some of their characteristics. We also introduce a new type of soft neutrosophic loop, the so-called soft strong neutrosophic loop, which is of purely neutrosophic character; this notion is also found in all the other corresponding notions of soft neutrosophic theory. We also give some properties of this newly born soft structure related to the strong part of neutrosophic theory.
[116] vixra:1411.0300 [pdf]
A New Wave Quantum Relativistic Equation from Quaternionic Representation of Maxwell-Dirac Isomorphism as an Alternative to Barut-Dirac Equation
It is known that Barut's equation can predict lepton and hadron masses with remarkable precision. Recently some authors have extended this equation, resulting in the Barut-Dirac equation. In the present article we argue that it is possible to derive a new wave equation, as an alternative to the Barut-Dirac equation, from the known exact correspondence (isomorphism) between the Dirac equation and Maxwell's electromagnetic equations via a biquaternionic representation. Furthermore, we submit the viewpoint that it would be more conceivable to interpret the vierbein of this equation in terms of superfluid velocity, which in turn brings us to the notion of a topological electronic liquid. Some implications of this proposition include the quantization of celestial systems. We also argue that it is possible to find some signatures of Bose-Einstein cosmology, which thus far has not been explored sufficiently in the literature. Further experimental observation to verify or refute this proposition is recommended.
[117] vixra:1411.0235 [pdf]
Application of New Absolute and Relative Conditioning Rules in Threat Assessment
This paper presents new absolute and relative conditioning rules as a possible solution to the multi-level conditioning problem in threat assessment. An example of the application of these rules with respect to a target observation threat model is provided. The paper also presents useful directions for managing the implemented multiple rules of conditioning in a real system.
[118] vixra:1411.0233 [pdf]
Computational-Communicative Actions of Informational Processing
This study is circumscribed to Information Science. The zetetic aim of the research is twofold: a) to define the concept of an action of informational processing, and b) to design a taxonomy of actions of informational processing.
[119] vixra:1408.0024 [pdf]
Dimension of Physical Space
Each state vector has its own corresponding element of the Cayley–Dickson algebra. The properties of a state vector require that this algebra be a normed division algebra. By the Hurwitz and Frobenius theorems, the maximal dimension of such an algebra is 8. Consequently, the dimension of the corresponding complex state vectors is 4, and the dimension of the Clifford set elements is 4x4. Such a set contains 5 matrices, among them 3 diagonal. Hence, the dimension of the space of dot events is equal to 3+1.
[120] vixra:1402.0067 [pdf]
Measuring Complexity by Using Reduction to Solve P vs NP and NC & PH
This article argues that NC and PH are proper (in particular, that P is not NP) by using differences between reductions. We can prove that NC is proper by using the fact that AL0 is not NC; this means L is not P. We can prove that P is not NP by using the reduction difference between L and P. We can also prove that PH is proper by using the fact that P is not NP.
[121] vixra:1401.0207 [pdf]
Approach to solve P vs PSPACE with Collapse of Logarithmic and Polynomial Space
This article argues that P is not PSPACE. If P were PSPACE, we could derive that P is L from the relation between logarithmic and polynomial reductions. But this result contradicts the Space Hierarchy Theorem. Therefore P is not PSPACE.
[122] vixra:1401.0179 [pdf]
Symmetry as Turing Machine - Approach to solve P vs NP
This article argues that P is not NP by using a difference of symmetry. A Turing Machine (TM) changes its configuration by using transition functions, and these changes preserve halting configurations; that is, the TM classifies configurations into equivalence classes. From the viewpoint of equivalence classes, P and coNP differ: some coNP problems have totally ordered inputs of more than polynomial size. These problems cannot be reduced to P because the total order must be preserved. Therefore we cannot reduce some coNP problems to P problems, which means P is not NP.
[123] vixra:1312.0247 [pdf]
Approach to Solve P vs NP by Using Bijection Reduction
This article argues that P is not NP by using a bijective reduction between problems. If injective reductions in each direction between CNFSAT and HornSAT exist, a bijection between CNFSAT and HornSAT also exists. If P is NP, this bijection is polynomial time. But the HornSAT description is of polynomial complexity and the CNFSAT description is of exponential complexity. This means that there is no bijection in polynomial time. Therefore P is not NP.
[124] vixra:1307.0075 [pdf]
The Truth About Geometric Unity
In May of 2013 a pair of articles appeared on the Guardian newspaper website featuring a new candidate "theory of everything" called Geometric Unity. A cursory reading of each article gives the impression that Geometric Unity was developed by Eric Weinstein, but a closer reading reveals that Weinstein is not credited as such. The truth about Geometric Unity is that it was authored by this writer in several papers beginning in 2009. This article will describe the development and prominent features of the new theory.
[125] vixra:1304.0109 [pdf]
The Twilight of the Scientific Age
This brief article presents the introduction and draft of the fundamental ideas developed at length in the book of the same title, which gives a challenging point of view about science and its history/philosophy/sociology. Science is in decline. After centuries of great achievements, the exhaustion of new forms and fatigue have reached our culture in all of its manifestations including the pure sciences. Our society is saturated with knowledge which does not offer people any sense in their lives. There is a loss of ideals in the search for great truths and a shift towards an anodyne specialized industry.
[126] vixra:1303.0196 [pdf]
Scientific Errors and Ambiguities in Prominent Submissions to Canadian Environmental Assessments: A Case Study of the Jackpine Mine Expansion Project
In Canada, as in many other developed nations, natural resource development projects meeting certain criteria are required to undergo an environmental assessment (EA) process to determine potential human and ecological health impacts. As part of the Canadian EA process, the Canadian Environmental Assessment Agency generally considers submissions by members of the public and experts. While the allowance of external submissions during EA hearings forms an important component of a functional participatory democracy, little attention appears to have been given regarding the quality of such EA submissions. In particular, submissions to EA hearings by prominent individuals and/or groups may be weighted more heavily in the overall decision making framework than those from non-experts. Important questions arise through the allowance and consideration of external submissions to EAs, such as whether inaccuracies in any such submissions may misdirect the EA decision makers to reach erroneous conclusions, and if such inaccuracies do result in sub-optimal EA processes, how the issues should be addressed. In the current work, a representative recent external submission from a prominent public individual and group to the Shell Canada Jackpine Mine Expansion (JPME) Project EA hearings was examined. The case study submission to the JPME EA hearings appears to contain a number of significant scientific errors and/or ambiguities, demonstrating that the EA process in Canada appears to allow potentially flawed submissions from prominent individuals and/or groups, and these problematic submissions may result in unnecessary delays, expenses, or even erroneous decisions. From a public policy perspective, it is desirable that the Canadian EA process be reformed to minimize contributions that may not result in an accurate assessment of the underlying science for the project(s) under consideration.
[127] vixra:1302.0105 [pdf]
Is the Field of Numbers a Real Physical Field? On the Frequent Distribution and Masses of the Elementary Particles
Frequency distributions of databases of numerical values obtained by resolving algorithms which describe physical and other processes make it possible to bound the probability of the results the algorithms obtain. In the frequency distribution of fractions of integers (rational numbers), local maxima which match the ratios of the masses of the elementary particles have been found.
[128] vixra:1301.0073 [pdf]
The Decline of Global Per Capita Renewable Internal Freshwater Resources
Supplies of per capita renewable internal freshwater resources are declining at alarming rates around the globe, necessitating efforts to better manage population growth and the use and distribution of freshwater. All major geographic regions saw substantial reductions in per capita renewable internal freshwater supplies between 1962 and 2011. Over this period, the global per capita freshwater stock declined by 54%, with decreases of 75% in Sub-Saharan Africa, 71% in the Middle East and North Africa, 64% in South Asia, 61% in Latin America and the Caribbean, 52% in East Asia and the Pacific, and 41% in North America. At current rates of depletion, global per capita renewable internal freshwater resources are projected to decline by 65% compared to 1962 values before stabilizing, with regional variation ranging from 60% in East Asia and the Pacific to 86% in the Middle East and North Africa. Sub-Saharan Africa is predicted to reach a negative per capita renewable internal freshwater balance by the year 2120. Per capita renewable internal freshwater resources are declining more rapidly in low income countries than in their middle and high income counterparts. All countries except Hungary and Bulgaria experienced declines in their per capita renewable internal freshwater supply between 1962 and 2011. Most countries (55%) experienced a decline of between 60% and 80% in per capita renewable internal freshwater resources over this period. The majority of nations are projected to maintain positive per capita renewable internal freshwater balances under steady-state conditions, although overall declines of between 80% and almost 100% from 1962 levels are dominant (~52% of all countries). A group of 28 nations is projected to reach zero per capita internal freshwater resources in the near future. African countries dominate the list of nations projected to reach zero per capita internal freshwater resources, comprising 16 of the 28 countries, of which six are landlocked.
A further group of 25 nations have data records that are too short, and recent population dynamics that are generally too complex, for reliable trend extrapolation. Close attention will need to be paid to the per capita renewable internal freshwater resource trends for these countries over the coming decades in order to obtain a better understanding of their resource depletion rates.
[129] vixra:1212.0002 [pdf]
Finding the Fine Structure of the Solutions of Complicate Logical Probabilistic Problems by the Frequent Distributions
The author suggests that frequency distributions can be applied to modelling the influences of stochastically perturbing factors on physical processes and situations, in order to look for the most probable numerical values of the parameters of complicated systems. In this way, very visual spectra of particular underdetermined complex problems have been obtained. These spectra allow the probabilistic behaviour of the system to be predicted.
[130] vixra:1211.0036 [pdf]
John von Neumann and Self-Reference ...
It is shown that the description of John von Neumann as a "frog" in a recent item by the Princeton celebrity physicist Freeman Dyson misses, among other things, the immensely important revolution of the so-called "von Neumann architecture" of our modern electronic digital computers.
[131] vixra:1211.0026 [pdf]
A Scienceographic Comparison of Physics Papers from the arXiv and viXra Archives
arXiv is an e-print repository of papers in physics, computer science, and biology, amongst others. viXra is a newer repository of e-prints on similar topics. Scienceography is the study of the writing of science. In this work we perform a scienceographic comparison of a selection of papers from the physics section of each archive. We provide the first study of the viXra archive and describe key differences on how science is written by these communities.
[132] vixra:1209.0059 [pdf]
Towards a Unified Model of Outdoor and Indoor Spaces
Geographic information systems traditionally dealt with only outdoor spaces. In recent years, indoor spatial information systems have started to attract attention partly due to the increasing use of receptor devices (e.g., RFID readers or wireless sensor networks) in both outdoor and indoor spaces. Applications that employ these devices are expected to span uniformly and supply seamless functionality in both outdoor and indoor spaces. What makes this impossible is the current absence of a unified account of these two types of spaces both in terms of modeling and reasoning about the models. This paper presents a unified model of outdoor and indoor spaces and receptor deployments in these spaces. The model is expressive, flexible, and invariant to the segmentation of a space plan, and the receptor deployment policy. It is focused on partially constrained outdoor and indoor motion, and it aims at underlying the construction of future, powerful reasoning applications.
[133] vixra:1207.0001 [pdf]
Chalmers Science School for International and Swedish Students
This paper describes the implementation of competitively based and highly structured scientific programs in the framework of a science school at Chalmers University of Technology. We discuss the implementation, advantages and disadvantages of those programs, the requirements students and supervisors should fulfill, whether from academia or from industry, and we present the selection method for participants. We also reflect on the results of a survey conducted recently among Chalmers academic staff. We believe that the installation of this science school at Chalmers brings many advantages to students, starting with a better understanding of industry practices and ending with an easier path to recruitment. It further helps employers in efficiently administering the process of hiring students and in discovering technological breakthroughs. Moreover, it enables the university to establish better connections with industry and later use its feedback to enhance academic courses and their content. Our method derives from the successful practices of a pioneering science school at the Israeli Weizmann Institute of Science, namely the Kupcinet-Getz Science School for Israeli and International Students. The method further acquires practices from published literature of relevance to our discussion. We aspire for Chalmers Science School to be a blueprint for any emerging or evolving science school at any educational institute worldwide.
[134] vixra:1202.0094 [pdf]
On Leveraging the Chaotic and Combinatorial Nature of Deterministic N-Body Dynamics on the Unit M-Sphere in Order to Implement a Pseudo-Random Number Generator
The goal of this paper is to describe how to implement a pseudo-random number generator by using deterministic n-body dynamics on the unit m-sphere. Throughout this paper we identify several types of patterns in dynamics, along with ways to interrupt the formation of these patterns.
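The paper's construction is not detailed in this abstract, so the following is only a toy illustration of the general idea: evolve a few points on the unit 2-sphere with a deterministic interaction, renormalize them back onto the sphere each step, and harvest low-order digits of a coordinate as output. The interaction law, step size and bit-extraction scheme are assumptions made for this sketch, and no claim is made about the statistical quality of the resulting stream.

```python
# Toy illustration only -- not the paper's actual construction. Four
# points on the unit 2-sphere evolve under a deterministic cross-product
# interaction; low-order digits of one coordinate become output bits.

import math

def step(points, dt=0.7):
    """One deterministic update; renormalization keeps points on the sphere."""
    new = []
    for i, p in enumerate(points):
        fx = fy = fz = 0.0
        for j, q in enumerate(points):
            if i != j:  # accumulate cross products p x q from the other bodies
                fx += p[1] * q[2] - p[2] * q[1]
                fy += p[2] * q[0] - p[0] * q[2]
                fz += p[0] * q[1] - p[1] * q[0]
        x, y, z = p[0] + dt * fx, p[1] + dt * fy, p[2] + dt * fz
        n = math.sqrt(x * x + y * y + z * z) or 1.0  # guard the degenerate case
        new.append((x / n, y / n, z / n))
    return new

def prng_bits(points, nbits, burn_in=100):
    """Discard a burn-in, then harvest one bit per step from a coordinate."""
    for _ in range(burn_in):
        points = step(points)
    bits = []
    for _ in range(nbits):
        points = step(points)
        bits.append(int(abs(points[0][0]) * 1e6) & 1)  # a low-order digit
    return bits

seed = [(1.0, 0.0, 0.0), (0.0, 1.0, 0.0), (0.6, 0.0, 0.8), (0.0, 0.6, 0.8)]
print(prng_bits(seed, 16))  # a deterministic, seed-dependent bit sequence
```

Because the dynamics are fully deterministic, the same seed configuration always reproduces the same bit sequence, which is the defining property of a pseudo-random (as opposed to truly random) generator.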
[135] vixra:1107.0054 [pdf]
Introductory Questions About Symbolic Values
Many distinct concepts including data, memes, and information are introduced. This manuscript aims to highlight the important role that unconscious physical reality plays in the creation and transmission of symbolic values.
[136] vixra:1004.0065 [pdf]
S-Denying a Theory
In this paper we introduce the operators of validation and invalidation of a proposition, and we extend the operator of S-denying a proposition, or an axiomatic system, from geometric spaces to any theory in any domain of knowledge, showing six examples in geometry, mathematical analysis, and topology.