Circumferential Theories in Fundamental Sciences of Estimation, Approximation, Data Modeling and Processing
by
© Ph. D. & Dr. Sc. Lev Gelimson
Academic Institute for Creating Fundamental Sciences (Munich, Germany)
Mathematical Journal
of the "Collegium" All World Academy of Sciences
Munich (Germany)
11 (2011), 6
In data modeling, processing, estimation, and approximation [1], data scatter is often relatively great and frequently does not allow discriminating between different types or forms of analytic approximation expressions, e.g. linear, piecewise linear, parabolic, hyperbolic, circumferential, elliptic, sinusoidal, etc. for two-dimensional data, or linear, piecewise linear, paraboloidal, hyperboloidal, spherical, ellipsoidal, etc. for three-dimensional data. In such a situation, a purely analytic approach alone is blind and often leads to false results. Without graphically interpreting the given data, it is almost impossible to discover the relations between them and their laws and hence to provide adequate data processing. To reasonably approximate the given data analytically, it is necessary and very useful to create conditions for efficiently applying analytic methods.
As ever, the fundamental principle of tolerable simplicity [2-7] plays a key role.
In overmathematics [2-7] and the fundamental sciences of estimation [8-13], approximation [14, 15], as well as data modeling [16] and processing [17], in order to clearly interpret given three-dimensional data graphically, it is very useful to model them two-dimensionally via a suitable data transformation whenever possible. For example, this is the case for strength data due to the fundamental science of strength data unification, modeling, analysis, processing, approximation, and estimation [18, 19].
Circumferential theories in fundamental sciences of estimation, approximation, data modeling and processing are applicable to given two-dimensional data for which a certain probable approximation law exists. It is also possible to combine these theories with other theories transforming the given data and to use not only their end data but also their intermediate data. In particular, preliminarily apply graph-analytic theories [20], principal graph types theories [21], invariance theories [22], groupwise centralization theories [23], bounds mean theories [24], linear two-dimensional [25] and three-dimensional [26] theories of moments of inertia, as well as general theories of moments of inertia [27] in fundamental sciences of estimation, approximation, data modeling and processing to the given data.
Circumferential theories complete (supplement) all these and other theories and consider all the given data points.
The ideas and essence of circumferential theories are as follows:
1. Determining the least closed area containing all the data points.
2. Very roughly and purely graphically determining a certain probable approximation law with the corresponding probable approximation graph, namely a line (straight line, broken straight line, or curve) in this two-dimensional case.
3. Dividing the area boundary graph into two subgraphs, namely the above and below subgraphs (if this law graph is nonclosed) or the outer and inner subgraphs (if this law graph is closed), thus building both extreme levels of the given data points.
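To make steps 1 and 3 concrete, the following minimal Python sketch (an illustration added here, not part of the original theories; the function names are assumptions of this illustration) takes the convex hull as one possible least closed area containing all the data points and, via the monotone-chain construction, directly obtains the below and above boundary subgraphs relative to the chord between the leftmost and the rightmost hull points.

# Minimal sketch (assumption): the convex hull as one possible least closed
# area containing all data points; the monotone-chain construction directly
# yields the below (lower) and above (upper) boundary subgraphs.

def cross(o, a, b):
    """z-component of the cross product OA x OB (orientation test)."""
    return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

def hull_subgraphs(points):
    """Return (below, above) boundary chains of the convex hull of 2D points."""
    pts = sorted(set(points))
    if len(pts) < 3:
        return pts, pts
    below, above = [], []
    for p in pts:                      # lower (below) boundary subgraph
        while len(below) >= 2 and cross(below[-2], below[-1], p) <= 0:
            below.pop()
        below.append(p)
    for p in reversed(pts):            # upper (above) boundary subgraph
        while len(above) >= 2 and cross(above[-2], above[-1], p) <= 0:
            above.pop()
        above.append(p)
    return below, above

if __name__ == "__main__":
    data = [(0.0, 0.1), (1.0, 0.9), (2.0, 2.2), (3.0, 2.8), (1.5, 1.0), (2.5, 3.1)]
    lower, upper = hull_subgraphs(data)
    print("below subgraph:", lower)
    print("above subgraph:", upper)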
4. Determining the type [21] of this law graph.
4.1. If this type is linear, namely a straight line in this two-dimensional case, then by the fundamental principle of tolerable simplicity [2-7], first apply precisely linear theories, e.g. least squared distance theories [28, 29], the least biquadratic method [30], and quadratic mean theories for two dimensions [31] in fundamental sciences of estimation, approximation, data modeling and processing to the given data points. If there are bounds and limitations which allow using certain predefined parts only, this leads to clear complications. This holds even in the simplest case of a straight line whose limited parts can be its open intervals, half-open intervals, and closed intervals (segments), as well as more complicated parts. For example, if the limited parts of a straight line do not contain the base of the perpendicular from a given data point onto the graph, then the graph point nearest to this data point can be not unique but multiple, and it is also possible that there is no such nearest point at all, e.g. when any arbitrarily small neighborhood (vicinity) of the base of that perpendicular contains points of the admissible parts of the graph while the base itself is not admissible, so that the infimum of the distance is not attained. The same can hold not only for this linear graph type. Secondly, it is also possible to apply general central normalization theories [32] in fundamental sciences of estimation, approximation, data modeling and processing to the given data points in order to investigate whether nonlinear theories can give essential advantages compared with linear theories.
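The complication described in 4.1 can be illustrated by the following minimal Python sketch (an assumption of this illustration, not the least squared distance theories [28, 29] themselves): for an unrestricted straight line, the nearest point to a given data point is the base of the perpendicular, whereas for a closed segment the nearest admissible point may be an endpoint, and for an open interval excluding that endpoint no nearest point exists at all, only an unattained distance infimum.

# Minimal sketch (assumption): nearest point of a straight line vs. a closed
# segment AB to a given data point P; a and b are assumed distinct.
# For an open interval, a clamped value t in {0, 1} would mark a case where
# no nearest point exists on the admissible part, only an unattained infimum.

def nearest_on_line_and_segment(a, b, p):
    ax, ay = a
    bx, by = b
    px, py = p
    dx, dy = bx - ax, by - ay
    t = ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)
    foot = (ax + t * dx, ay + t * dy)          # base of the perpendicular
    tc = min(1.0, max(0.0, t))                 # restriction to the segment
    on_segment = (ax + tc * dx, ay + tc * dy)  # nearest admissible point
    return foot, on_segment

if __name__ == "__main__":
    print(nearest_on_line_and_segment((0.0, 0.0), (1.0, 0.0), (2.0, 1.0)))
    # the foot (2.0, 0.0) lies outside the segment; the nearest segment point is (1.0, 0.0)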
4.2. If this type is piecewise linear, namely a broken straight line in this two-dimensional case, then by the fundamental principle of tolerable simplicity [2-7], first divide the given data into appropriate parts and consider them separately along with the corresponding parts of linear graphs.
4.3. If this type is quasilinear with equal curvature signs, namely the result of relatively slightly deforming (bending, twisting, distorting, or warping) parts of linear graphs (straight lines) into arcs without deflection (without changes of the curvature sign), then by the fundamental principle of tolerable simplicity [2-7], along with Cartesian coordinate systems and their transformations equalizing the generally different mean curvatures, using polar coordinate systems with either predefined (fixed) or variable poles can bring additional advantages. To select such poles, preliminarily consider the probable centers of curvature and the ranges of their variability.
4.4. If this type is quasilinear with piecewise equal curvature signs, namely combining limited parts of quasilinear graphs with equal curvature signs, then by the fundamental principle of tolerable simplicity [2-7], first divide the given data into appropriate parts and consider them separately along with the corresponding parts of quasilinear graphs with piecewise equal curvature signs.
4.5. If this type is closed quasilinear with equal curvature signs which contains, e.g., circumferences and ellipses in this two-dimensional case, then by the fundamental principle of tolerable simplicity [2-7], along with Cartesian coordinate systems with their transformations equalizing the sums of the second powers of the homonymous coordinates of the given data points, using polar coordinate systems can bring additional advantages, too. To begin with, select the given data center as a pole.
Consider such an initially introduced polar coordinate system.
When using a polar coordinate system, it is also possible to additionally introduce the Cartesian interpretation of this polar coordinate system, e.g. with the polar angle as the abscissa and the polar distance as the ordinate in this two-dimensional case.
In many practically important cases, these simplest graph types suffice for data modeling, processing, estimation, and approximation. Otherwise, additionally introduce more complicated graph types, e.g. quantigraph types containing quantigraphs belonging to the quantisets building quantialgebras in quantianalysis in overmathematics [2-7].
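For 4.5 and the Cartesian interpretation of a polar coordinate system mentioned above, the following minimal Python sketch (an illustrative assumption, with assumed function names) selects the given data center as the pole and represents each data point by its polar angle as the abscissa and its polar distance as the ordinate.

# Minimal sketch (assumption): polar coordinates of the given data points
# about their center selected as the pole, together with the Cartesian
# interpretation (polar angle as abscissa, polar distance as ordinate).
import math

def polar_about_center(points):
    n = len(points)
    cx = sum(x for x, _ in points) / n        # data center selected as the pole
    cy = sum(y for _, y in points) / n
    polar = []
    for x, y in points:
        r = math.hypot(x - cx, y - cy)        # polar radius (distance from the pole)
        phi = math.atan2(y - cy, x - cx)      # polar angle
        polar.append((phi, r))                # (abscissa, ordinate) interpretation
    return (cx, cy), polar

if __name__ == "__main__":
    pole, polar = polar_about_center([(1.0, 0.0), (0.0, 1.1), (-0.9, 0.0), (0.0, -1.0)])
    print("pole:", pole)
    print("(phi, r):", polar)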
5. Introducing a normalization transformation of all the points of the whole plane in this two-dimensional case. The essence of this normalization transformation is dividing the distance of each plane point from the selected pole by the distance from the selected pole of the intersection (nearest to this point) of its polar ray (straight half-line) with the probable approximation graph. Apply this normalization transformation both to all the given data points and to all the points of the probable approximation graph.
5.1. For each initial data point, consider the unique polar ray (straight half-line) containing this point and determine the set of all the intersections of this polar ray with the probable approximation graph.
5.1.1. If this set contains one element only, then consider the corresponding probable approximation graph point as the polar projection of this data point on this probable approximation graph. Further consider this data point and its polar projection on this graph as mutually one-to-one corresponding to one another.
5.1.2. Otherwise (if this set uniquantity [2-7] is greater than 1), among its elements, consider the set of the probable approximation graph points which are the nearest to this data point.
5.1.2.1. If this set clearly contains one element only, then consider the corresponding probable approximation graph point as the polar projection of this data point on this probable approximation graph. Further consider this data point and its polar projection on this graph as mutually one-to-one corresponding to one another.
5.1.2.2. Otherwise (if this set is fuzzy, namely there are two probable approximation graph points for which it is unclear which of them is nearer to this data point than the other one), consider the set of both these probable approximation graph points, which are both nearest to this data point. The set uniquantity is q = 2. Divide 1 (the data point quantity) by this uniquantity q = 2 and consider this data point as the quantiset [2-7] of q = 2 quantielements, each of which coincides with this data point and has quantity 1/q = 1/2 . Then for each of these two polar projection points on this probable approximation graph, select precisely one data point quantielement and consider it and this projection as mutually one-to-one corresponding to one another.
5.2. For each initial data point quantielement, divide its distance from the selected pole by the distance of the corresponding polar projection point from the same selected pole to obtain the normalized polar distance of the corresponding normalized data point quantielement.
5.3. Apply the same normalization transformation to the area boundary graph consisting of two subgraphs, namely the above and below subgraphs (if this law graph is nonclosed) or the outer and inner subgraphs (if this law graph is closed), which build both extreme levels of the given data points, to obtain the corresponding normalized area boundary graph and both normalized area boundary subgraphs.
5.4. Apply the same normalization transformation to this probable approximation graph itself to naturally obtain the unit circumference in this two-dimensional case. Its center coincides with the same selected pole. The radius of this unit circumference is 1. This unit circumference is the corresponding normalized probable approximation graph.
5.5. Apply the same normalization transformation to the initially introduced polar coordinate system to obtain the corresponding normalized polar coordinate system.
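The normalization transformation of step 5 can be sketched as follows under the simplifying assumption (an assumption of this illustration) that the probable approximation graph is given in polar form r = rho(phi) and is star-shaped with respect to the selected pole, so that every polar ray meets it exactly once and case 5.1.1 applies to every data point; the ellipse used below is only an illustrative choice.

# Minimal sketch (assumption): the normalization transformation of step 5 for
# a probable approximation graph given in polar form r = rho(phi) and
# star-shaped about the selected pole, so that every polar ray meets the
# graph exactly once (case 5.1.1).
import math

def ellipse_rho(phi, a=2.0, b=1.0):
    """Illustrative probable approximation graph: a pole-centered ellipse."""
    return a * b / math.hypot(b * math.cos(phi), a * math.sin(phi))

def normalize(polar_points, rho):
    """Divide each polar radius by the graph radius on the same polar ray."""
    return [(phi, r / rho(phi)) for phi, r in polar_points]

if __name__ == "__main__":
    data = [(0.0, 2.1), (math.pi / 2, 0.9), (math.pi, 1.95), (-math.pi / 2, 1.05)]
    print(normalize(data, ellipse_rho))
    # the graph itself normalizes to the unit circumference: r = 1 for every phi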
6. Interpreting both all the normalized data points quantielements and this normalized probable approximation graph in this normalized polar coordinate system.
7. Applying circumferential theories themselves both to all the normalized data points quantielements and to this normalized probable approximation graph in this normalized polar coordinate system.
8. Determining a normalized graph which can be considered a mean (middle) normalized graph, in this normalized polar coordinate system, between both normalized area boundary subgraphs in a certain reasonable sense, taking all the normalized data and the valid data invariance type into account. Under rotation invariance, use general theories of moments of inertia [27] in fundamental sciences of estimation, approximation, data modeling and processing.
9. Directly considering this mean (middle) normalized graph to be precisely a normalized graph of this desired approximation law.
10. Applying the inverse of this normalization transformation to this mean (middle) normalized graph in this normalized polar coordinate system to determine a graph which can be considered a mean (middle) graph, in the initially introduced polar coordinate system, between both area boundary subgraphs in a certain reasonable sense, taking all the normalized data and the valid data invariance type into account. Under rotation invariance, use general theories of moments of inertia [27] in fundamental sciences of estimation, approximation, data modeling and processing.
11. Directly considering this mean (middle) graph in the initially introduced polar coordinate system to be precisely a graph of this desired approximation law.
Given n (n ∈ N+ = {1, 2, ...}, n > 2) data points [j=1n (xj , yj)] = {(x1 , y1), (x2 , y2), ... , (xn , yn)} with any real coordinates in the initial Cartesian two-dimensional coordinate system Oxy .
Determine the polar radius rj and the polar angle φj of the jth data point in the initial polar two-dimensional coordinate system Orφ .
Apply the normalization transformation and obtain the polar radius r°j and the polar angle φ°j of the jth normalized data point in the normalized polar two-dimensional coordinate system O°r°φ° .
Now determine the normalized polar radius
r° = (Σj=1n r°j) / n
of a normalized circumference.
Directly consider this normalized circumference as a mean (middle) normalized graph of this desired approximation law.
Apply the inverse of this normalization transformation to this mean (middle) normalized graph in this normalized polar coordinate system to determine a graph which can be considered a mean (middle) graph in the initially introduced polar coordinate system.
Directly consider this mean (middle) graph in the initially introduced polar coordinate system to be precisely a graph of this desired approximation law.
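Under the same simplifying assumption of a star-shaped probable approximation graph given in polar form r = rho(phi) (so that every data point keeps quantity 1 and has a unique polar projection), the above computation reduces to the following minimal Python sketch: normalize the polar radii, average them to obtain r°, and invert the normalization to obtain the approximating graph r(phi) = r° rho(phi) in the initially introduced polar coordinate system.

# Minimal sketch (assumption): the circumferential approximation under the
# simplifying assumptions of a star-shaped probable approximation graph
# r = rho(phi) and unit quantities, so r°j = rj / rho(phij),
# r° = (sum of r°j) / n, and the inverse normalization yields the
# approximating graph r(phi) = r° * rho(phi).
import math

def circumferential_fit(polar_points, rho):
    normalized = [r / rho(phi) for phi, r in polar_points]
    r_mean = sum(normalized) / len(normalized)          # normalized polar radius r°
    return r_mean, (lambda phi: r_mean * rho(phi))      # inverted mean (middle) graph

if __name__ == "__main__":
    rho = lambda phi: 1.0                                # probable graph: unit circumference
    data = [(0.0, 1.9), (math.pi / 2, 2.1), (math.pi, 2.0), (-math.pi / 2, 2.05)]
    r_mean, graph = circumferential_fit(data, rho)
    print("r° =", r_mean)                                # about 2.0125
    print("approximating radius at phi = 0:", graph(0.0))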
Circumferential theories consider all the given data points and provide a relatively simple approximation of all the given data.
To improve data modeling, processing, estimation, and approximation, it is also possible to preliminarily represent each local data point group by its center, whose quantity equals the number of the points in this group, and then to apply both graphical and analytic approaches to the already groupwise centralized data, namely to a quantiset [2-7] of their local groupwise centers.
Consider a group of data point quantielements, generally their quantiset. Denote the ith coordinate by xi (i = 1, 2, ... , m , m ∈ N+ = {1, 2, ...}; e.g. for m = 2, x1 is the x-coordinate and x2 the y-coordinate), the ith coordinate of the jth data point quantielement [2-7] (j = 1, 2, ... , n , n ∈ N+ = {1, 2, ...}) of such a group by xij , and the quantity of the element in this quantielement by qj . Replace this group with its weighted central data point quantielement. The ith coordinate of the element of this quantielement is
xi = (Σj=1n qj xij) / (Σj=1n qj) ,
and the quantity of the element in this quantielement is
q = Σj=1n qj .
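The following minimal Python sketch (an illustration added here, with an assumed function name) implements this groupwise centralization for arbitrary m: a group of data point quantielements is replaced by its weighted central quantielement whose coordinates are the quantity-weighted means and whose quantity is the sum of the group quantities.

# Minimal sketch (assumption): replacing a group of data point quantielements
# with its weighted central quantielement; the coordinates are the
# quantity-weighted means and the resulting quantity is the sum of the
# group quantities.

def groupwise_center(group):
    """group: list of (coordinates_tuple, quantity) pairs of equal dimension m."""
    total_q = sum(q for _, q in group)
    m = len(group[0][0])
    center = tuple(sum(q * coords[i] for coords, q in group) / total_q
                   for i in range(m))
    return center, total_q

if __name__ == "__main__":
    group = [((0.0, 0.0), 1.0), ((2.0, 0.0), 1.0), ((1.0, 3.0), 0.5)]
    print(groupwise_center(group))   # ((1.0, 0.6), 2.5)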
The variety and variability of circumferential theories provide the flexibility of their algorithms.
These theories are very efficient in estimation, approximation, data modeling and processing.
Acknowledgements to Anatolij Gelimson for our constructive discussions on coordinate system transformation invariances and his very useful remarks.
References
[1] Encyclopaedia of Mathematics. Ed. M. Hazewinkel. Volumes 1 to 10. Kluwer Academic Publ., Dordrecht, 1988-1994
[2] Lev Gelimson. Basic New Mathematics. Monograph. Drukar Publishers, Sumy, 1995
[3] Lev Gelimson. General Analytic Methods. Abhandlungen der WIGB (Wissenschaftlichen Gesellschaft zu Berlin), 3 (2003), Berlin
[4] Lev Gelimson. Elastic Mathematics. Abhandlungen der WIGB (Wissenschaftlichen Gesellschaft zu Berlin), 3 (2003), Berlin
[5] Lev Gelimson. Elastic Mathematics. General Strength Theory. Mathematical, Mechanical, Strength, Physical, and Engineering Monograph. The “Collegium” All World Academy of Sciences Publishers, Munich (Germany), 2004
[6] Lev Gelimson. Providing Helicopter Fatigue Strength: Flight Conditions [Overmathematics and Other Fundamental Mathematical Sciences]. In: Structural Integrity of Advanced Aircraft and Life Extension for Current Fleets – Lessons Learned in 50 Years After the Comet Accidents, Proceedings of the 23rd ICAF Symposium, Dalle Donne, C. (Ed.), 2005, Hamburg, Vol. II, 405-416
[7] Lev Gelimson. Overmathematics: Fundamental Principles, Theories, Methods, and Laws of Science. Mathematical Monograph. The “Collegium” All World Academy of Sciences Publishers, Munich (Germany), 2009
[8] Lev Gelimson. General estimation theory. Transactions of the Ukraine Glass Institute, 1 (1994), 214-221
[9] Lev Gelimson. General Estimation Theory. Mathematical Monograph. The “Collegium” All World Academy of Sciences Publishers, Munich (Germany), 2001
[10] Lev Gelimson. General Estimation Theory Fundamentals. Mathematical Journal of the “Collegium” All World Academy of Sciences, Munich (Germany), 1 (2001), 3
[11] Lev Gelimson. General Estimation Theory Fundamentals (along with its line by line translation into Japanese). Mathematical Journal of the “Collegium” All World Academy of Sciences, Munich (Germany), 9 (2009), 1
[12] Lev Gelimson. General Estimation Theory (along with its line by line translation into Japanese). Mathematical Monograph. The “Collegium” All World Academy of Sciences Publishers, Munich (Germany), 2011
[13] Lev Gelimson. Fundamental Science of Estimation. Mathematical Monograph. The “Collegium” All World Academy of Sciences Publishers, Munich (Germany), 2011
[14] Lev Gelimson. General Problem Theory. Abhandlungen der WIGB (Wissenschaftlichen Gesellschaft zu Berlin), 3 (2003), Berlin
[15] Lev Gelimson. Fundamental Science of Approximation. Mathematical Monograph. The “Collegium” All World Academy of Sciences Publishers, Munich (Germany), 2011
[16] Lev Gelimson. Fundamental Science of Data Modeling. Mathematical Monograph. The “Collegium” All World Academy of Sciences Publishers, Munich (Germany), 2011
[17] Lev Gelimson. Fundamental Science of Data Processing. Mathematical Monograph. The “Collegium” All World Academy of Sciences Publishers, Munich (Germany), 2011
[18] Lev Gelimson. Fundamental Science of Strength Data Unification, Modeling, Analysis, Processing, Approximation, and Estimation (Essential). Strength and Engineering Journal of the “Collegium” All World Academy of Sciences, Munich (Germany), 10 (2010), 3
[19] Lev Gelimson. Fundamental Science of Strength Data Unification, Modeling, Analysis, Processing, Approximation, and Estimation (Fundamentals). Strength Monograph. The “Collegium” All World Academy of Sciences Publishers, Munich (Germany), 2010
[20] Lev Gelimson. Graph-Analytic Theories in Fundamental Sciences of Estimation, Approximation, Data Modeling and Processing (Essential). Mathematical Journal of the “Collegium” All World Academy of Sciences, Munich (Germany), 11 (2011), 2
[21] Lev Gelimson. Principal Graph Types Theories in Fundamental Sciences of Estimation, Approximation, Data Modeling and Processing (Essential). Mathematical Journal of the “Collegium” All World Academy of Sciences, Munich (Germany), 11 (2011), 3
[22] Lev Gelimson. Data, Problem, Method, and Result Invariance Theories in Fundamental Sciences of Estimation, Approximation, Data Modeling and Processing, and Solving General Problems (Essential). Mathematical Journal of the “Collegium” All World Academy of Sciences, Munich (Germany), 11 (2011), 1
[23] Lev Gelimson. Groupwise Centralization Theories in Fundamental Sciences of Estimation, Approximation, Data Modeling and Processing (Essential). Mechanical and Physical Journal of the “Collegium” All World Academy of Sciences, Munich (Germany), 11 (2011), 1
[24] Lev Gelimson. Bounds Mean Theories in Fundamental Sciences of Estimation, Approximation, Data Modeling and Processing (Essential). Mathematical Journal of the “Collegium” All World Academy of Sciences, Munich (Germany), 11 (2011), 4
[25] Lev Gelimson. Linear Two-Dimensional Theories of Moments of Inertia in Fundamental Sciences of Estimation, Approximation, and Data Processing (Essential). Mechanical and Physical Journal of the “Collegium” All World Academy of Sciences, Munich (Germany), 10 (2010), 1
[26] Lev Gelimson. Linear Three-Dimensional Theories of Moments of Inertia in Fundamental Sciences of Estimation, Approximation, and Data Processing (Essential). Mechanical and Physical Journal of the “Collegium” All World Academy of Sciences, Munich (Germany), 10 (2010), 2
[27] Lev Gelimson. General Theories of Moments of Inertia in Fundamental Sciences of Estimation, Approximation, Data Modeling and Processing (Essential). Mechanical and Physical Journal of the “Collegium” All World Academy of Sciences, Munich (Germany), 10 (2010), 3
[28] Lev Gelimson. Least Squared Distance Theory in Fundamental Science of Solving General Problems (Essential). Mathematical Journal of the “Collegium” All World Academy of Sciences, Munich (Germany), 10 (2010), 1
[29] Lev Gelimson. Least Squared Distance Theories in Fundamental Sciences of Estimation, Approximation, and Data Processing (Essential). Mathematical Journal of the “Collegium” All World Academy of Sciences, Munich (Germany), 10 (2010), 2
[30] Lev Gelimson. Least Biquadratic Method in Fundamental Sciences of Estimation, Approximation, and Data Processing (Essential). Mathematical Journal of the “Collegium” All World Academy of Sciences, Munich (Germany), 10 (2010), 3
[31] Lev Gelimson. Quadratic Mean Theories for Two Dimensions in Fundamental Sciences of Approximation and Data Processing (Essential). Mathematical Journal of the “Collegium” All World Academy of Sciences, Munich (Germany), 10 (2010), 4
[32] Lev Gelimson. General Central Normalization Theories in Fundamental Sciences of Estimation, Approximation, Data Modeling and Processing (Essential). Mechanical and Physical Journal of the “Collegium” All World Academy of Sciences, Munich (Germany), 11 (2011), 2