General Center and Bisector Theories in Fundamental Sciences of Estimation, Approximation, Data Modeling and Processing, and Solving General Problems

by

© Ph. D. & Dr. Sc. Lev Gelimson

Academic Institute for Creating Fundamental Sciences (Munich, Germany)

Mechanical and Physical Journal

of the "Collegium" All World Academy of Sciences

Munich (Germany)

11 (2011), 5

Contradictory (e.g. overdetermined) problems consisting of sets of given relations with some unknowns have no precise solutions. It is necessary to find, in a relatively simple way, such values of these unknowns that all these relations are approximately satisfied with possibly small deviations in a certain reasonable sense. Such values are called quasisolutions to the corresponding problems.

In particular, in data modeling, processing, estimation, and approximation [1], data scatter is relatively great in many cases and often does not allow discriminating between different types or forms of analytic approximation expressions, e.g. linear, piecewise linear, parabolic, hyperbolic, circumferential, elliptic, sinusoidal, etc. for two-dimensional data, or linear, piecewise linear, paraboloidal, hyperboloidal, spherical, ellipsoidal, etc. for three-dimensional data. In such a situation, a purely analytic approach alone is blind and often leads to false results. Without graphically interpreting the given data, it is almost impossible to discover relations between them and their laws and thus to provide adequate data processing. For reasonably analytically approximating the given data, it is therefore necessary and very useful to create conditions for efficiently applying analytic methods.

As ever, the fundamental principle of tolerable simplicity [2-7] plays a key role.

In overmathematics [2-7] and fundamental sciences of estimation [8-13], approximation [14, 15], data modeling [16] and processing [17], as well as solving general problems [18], general center and bisector theories naturally groupwise consider all the given data without any exceptions. These theories provide improvement of data modeling, processing, estimation, and approximation, as well as solving general problems, via preliminarily rationally locally groupwise centralizing and pairwise bisecting appropriate data quantisets [2-7] graphically represented with precisely considering data quantities and synergistically applying both graphical and analytic approaches to the given, intermediate, and final data.

Consider a group of data point quantielements, generally their quantiset. Denote the ith coordinate by xi (i = 1, 2, ... , m , m ∈ N+ = {1, 2, ...}; e.g. for m = 2, x1 is the x-coordinate and x2 the y-coordinate), the ith coordinate of the jth data point quantielement [2-7] (j = 1, 2, ... , n , n ∈ N+ = {1, 2, ...}) of this group by xij , and the quantity of the element in this quantielement by qj . Replace this group with its weighted central data point quantielement. The ith coordinate of the element of this quantielement is

xi = Σj=1n qjxij / Σj=1n qj ,

and the quantity of the element in this quantielement is

q = Σj=1n qj .
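The two formulas above can be sketched in code. The following is an illustrative Python fragment (the function name and data are hypothetical, not from the source); it replaces a weighted group of data points with its single weighted central point and total quantity.

```python
def weighted_center(points, quantities):
    """Return the weighted central point and its total quantity.

    points     -- list of coordinate tuples (x_1j, ..., x_mj)
    quantities -- list of weights q_j, one per point
    """
    q_total = sum(quantities)
    m = len(points[0])
    # i-th coordinate: sum_j q_j * x_ij / sum_j q_j
    center = tuple(
        sum(q * p[i] for p, q in zip(points, quantities)) / q_total
        for i in range(m)
    )
    return center, q_total

center, q = weighted_center([(0.0, 0.0), (2.0, 0.0), (0.0, 2.0)],
                            [1.0, 1.0, 2.0])
# center == (0.5, 1.0), q == 4.0
```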

To clearly graphically interpret the given three-dimensional data, it is very useful to provide their two-dimensional modeling via suitable data transformation if possible. For example, this is the case by strength data due to fundamental science of strength data unification, modeling, analysis, processing, approximation, and estimation [19, 20].

Apply graph-analytic theories [21], principal graph types theories [22], and groupwise centralization theories [23] in fundamental sciences of estimation, approximation, data modeling and processing to the given data via the following graph-analytic algorithm:

1. Graphically represent the initial or suitably transformed given data in a two-dimensional Cartesian coordinate system.

2. Determine and separate clear outlier candidates (whose number should not exceed entier(n/10) + 1, i.e. about 10 % of the total number n of the data points) by applying intuitive graphical and then (if necessary and useful) analytic criteria to the (initial or suitably transformed) given data. For example, a data point could be considered an outlier candidate if the radius of the greatest circular neighborhood of this point containing none of the remaining given data points is at least three times greater than the arithmetic mean of such radii over the entier(9n/10), i.e. about 90 %, of the given data points with the least radii. Another possible criterion: For every data point, determine the sum of its distances from the entier(n^(1/2)) other data points nearest to it, compare these sums for all the data points, and separate at most entier(n/10) + 1 clear outlier candidates with the greatest sums, ideally all data points for which such sums are much greater than those for all the remaining (so-called clearly correct) data points.
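The second criterion of this step admits a straightforward sketch, assuming Euclidean distances in the plane; the function name and example data are illustrative only.

```python
import math

def outlier_candidates(points):
    """Flag at most entier(n/10) + 1 points with the greatest sums of
    distances to their entier(n^(1/2)) nearest neighbours."""
    n = len(points)
    k = math.isqrt(n)        # entier(n^(1/2)) neighbours per point
    limit = n // 10 + 1      # entier(n/10) + 1 candidates at most

    sums = []
    for i, p in enumerate(points):
        # distances from p to every other point, smallest first
        d = sorted(math.dist(p, q) for j, q in enumerate(points) if j != i)
        sums.append((sum(d[:k]), i))

    sums.sort(reverse=True)  # greatest distance sums first
    return [i for _, i in sums[:limit]]

pts = [(0, 0), (1, 0), (0, 1), (1, 1), (10, 10)]
# the isolated point (10, 10) is the single candidate: indices [4]
```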

3. Determine the least closed area containing all the clearly correct data points.

4. Consider all the outlier candidates and determine and separate clear outliers. Nota bene: An outlier candidate is not an outlier if and only if its distance from the least closed area (step 3) containing all the clearly correct data points either vanishes (when this candidate lies inside this area) or is sufficiently small, e.g. not greater than the mean distance of the clearly correct data point with the greatest sum (see step 2) from the entier(n^(1/2)) other data points nearest to it.

5. Add all the outlier candidates that are not outliers to all the clearly correct data points and obtain all the already correct data points.

6. Determine the least closed area containing all the already correct data points.
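One possible reading of the "least closed area" containing a set of points is their convex hull; this is an assumption for illustration, since the author's construction may be a tighter (non-convex) area. A minimal sketch using Andrew's monotone chain algorithm:

```python
def cross(o, a, b):
    """Cross product of vectors o->a and o->b (positive: left turn)."""
    return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

def convex_hull(points):
    """Return the convex hull vertices in counter-clockwise order."""
    pts = sorted(set(points))
    if len(pts) <= 2:
        return pts
    lower, upper = [], []
    for p in pts:
        while len(lower) >= 2 and cross(lower[-2], lower[-1], p) <= 0:
            lower.pop()
        lower.append(p)
    for p in reversed(pts):
        while len(upper) >= 2 and cross(upper[-2], upper[-1], p) <= 0:
            upper.pop()
        upper.append(p)
    return lower[:-1] + upper[:-1]

hull = convex_hull([(0, 0), (2, 0), (2, 2), (0, 2), (1, 1)])
# the interior point (1, 1) is dropped; the four corners remain
```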

7. Consider all the outliers and build such appropriate combinations of them that the elements of their weighted central data point quantielements belong to this area; then select the best set of such combinations, namely the one that uses the outliers as completely as possible without repetitions. In this way, it is usually possible to use some of the outliers. Then add their weighted central data point quantielements to the already correct data points and obtain the so-called correct data points.

8. Using the fundamental principle of tolerable simplicity [2-7], intuitively graphically determine and represent a possibly simple probable approximation line (straight line or curve) bisecting all the correct data points, taking into account either rotation invariance or linear transformation invariance of the given data [24]. Nota bene: At this stage, this line can be determined and represented very approximately, even only roughly, and should be considered fuzzy, namely as one of the possible lines in a band of a certain width which can vary along this line.
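As a possible analytic counterpart of the rotation-invariant case in step 8 (an assumption, not the author's prescription): rotation invariance of a fitted straight line suggests orthogonal (total least squares) regression rather than ordinary least squares, since the orthogonal-distance criterion is unchanged under rotating the coordinate system. A sketch for the two-dimensional weighted case, with illustrative names:

```python
import math

def orthogonal_line(points, quantities):
    """Fit a rotation-invariant straight line to weighted 2D data.

    Returns (center, angle): the line passes through the weighted
    center at the given angle (radians) to the x-axis, along the
    largest principal axis of the weighted scatter matrix.
    """
    q = sum(quantities)
    cx = sum(w * x for (x, _), w in zip(points, quantities)) / q
    cy = sum(w * y for (_, y), w in zip(points, quantities)) / q
    # centered second moments of the weighted data
    sxx = sum(w * (x - cx) ** 2 for (x, _), w in zip(points, quantities))
    syy = sum(w * (y - cy) ** 2 for (_, y), w in zip(points, quantities))
    sxy = sum(w * (x - cx) * (y - cy)
              for (x, y), w in zip(points, quantities))
    angle = 0.5 * math.atan2(2.0 * sxy, sxx - syy)
    return (cx, cy), angle

center, angle = orthogonal_line([(0, 0), (1, 1), (2, 2)], [1, 1, 1])
# collinear data on y = x: line through (1, 1) at 45 degrees
```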

When solving a contradictory (e.g. overdetermined) general problem [2-7, 18] consisting of a quantiset of given quantirelations with some unknowns, apply graph-analytic theories [21], principal graph types theories [22], and groupwise centralization theories [23] in fundamental sciences of estimation, approximation, data modeling and processing to the given data via the following graph-analytic algorithm:

1. Graphically represent the initial or suitably transformed given quantirelations in a two-dimensional Cartesian coordinate system.

2. Determine (groupwise if necessary) the possibly smallest areas between the graphs of possibly many quantirelations, the corresponding quantirelations NOT being satisfied in these areas, with all the quantirelations being considered.

3. Systematically select such appropriate pairs of these graphs or their parts that it is reasonable and possible to relatively reliably define and determine a straight (or, generally, curvilinear) bisector line (or at least its part) between each of these pairs, locally equally (in a certain suitable sense) dividing the corresponding part of the corresponding area. Nota bene: There can be different systems of selecting these pairs. Consider each of these systems separately (and compare their results with one another), for example:

3.1. General approach. Consider each pair of adjacent parts of the boundary of such an area. Start at the common point (node) of these parts and move into this area along the bisector of the internal angle between these parts.
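The construction of step 3.1 can be sketched as follows: at a node where two boundary parts meet, the internal-angle bisector direction is the normalized sum of the unit vectors along the two parts (a standard geometric fact; the function name and data are illustrative).

```python
import math

def bisector_direction(node, a, b):
    """Unit vector bisecting the angle at `node` between rays to a and b."""
    ua = (a[0] - node[0], a[1] - node[1])
    ub = (b[0] - node[0], b[1] - node[1])
    na, nb = math.hypot(*ua), math.hypot(*ub)
    # sum of the two unit direction vectors points along the bisector
    s = (ua[0] / na + ub[0] / nb, ua[1] / na + ub[1] / nb)
    ns = math.hypot(*s)
    return (s[0] / ns, s[1] / ns)

d = bisector_direction((0, 0), (1, 0), (0, 1))
# the bisector of the right angle at the origin: the diagonal direction
```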

These theories are very efficient in estimation, approximation, and data processing.

Acknowledgements to Anatolij Gelimson for our constructive discussions on coordinate system transformation invariances and his very useful remarks.

References

[1] Encyclopaedia of Mathematics. Ed. M. Hazewinkel. Volumes 1 to 10. Kluwer Academic Publ., Dordrecht, 1988-1994

[2] Lev Gelimson. Basic New Mathematics. Monograph. Drukar Publishers, Sumy, 1995

[3] Lev Gelimson. General Analytic Methods. Abhandlungen der WIGB (Wissenschaftlichen Gesellschaft zu Berlin), 3 (2003), Berlin

[4] Lev Gelimson. Elastic Mathematics. Abhandlungen der WIGB (Wissenschaftlichen Gesellschaft zu Berlin), 3 (2003), Berlin

[5] Lev Gelimson. Elastic Mathematics. General Strength Theory. Mathematical, Mechanical, Strength, Physical, and Engineering Monograph. The “Collegium” All World Academy of Sciences Publishers, Munich (Germany), 2004

[6] Lev Gelimson. Providing Helicopter Fatigue Strength: Flight Conditions [Overmathematics and Other Fundamental Mathematical Sciences]. In: Structural Integrity of Advanced Aircraft and Life Extension for Current Fleets – Lessons Learned in 50 Years After the Comet Accidents, Proceedings of the 23rd ICAF Symposium, Dalle Donne, C. (Ed.), 2005, Hamburg, Vol. II, 405-416

[7] Lev Gelimson. Overmathematics: Fundamental Principles, Theories, Methods, and Laws of Science. Mathematical Monograph. The “Collegium” All World Academy of Sciences Publishers, Munich (Germany), 2009

[8] Lev Gelimson. General estimation theory. Transactions of the Ukraine Glass Institute, 1 (1994), 214-221

[9] Lev Gelimson. General Estimation Theory. Mathematical Monograph. The “Collegium” All World Academy of Sciences Publishers, Munich (Germany), 2001

[10] Lev Gelimson. General Estimation Theory Fundamentals. Mathematical Journal of the “Collegium” All World Academy of Sciences, Munich (Germany), 1 (2001), 3

[11] Lev Gelimson. General Estimation Theory Fundamentals (along with its line by line translation into Japanese). Mathematical Journal of the “Collegium” All World Academy of Sciences, Munich (Germany), 9 (2009), 1

[12] Lev Gelimson. General Estimation Theory (along with its line by line translation into Japanese). Mathematical Monograph. The “Collegium” All World Academy of Sciences Publishers, Munich (Germany), 2011

[13] Lev Gelimson. Fundamental Science of Estimation. Mathematical Monograph. The “Collegium” All World Academy of Sciences Publishers, Munich (Germany), 2011

[14] Lev Gelimson. General Problem Theory. Abhandlungen der WIGB (Wissenschaftlichen Gesellschaft zu Berlin), 3 (2003), Berlin

[15] Lev Gelimson. Fundamental Science of Approximation. Mathematical Monograph. The “Collegium” All World Academy of Sciences Publishers, Munich (Germany), 2011

[16] Lev Gelimson. Fundamental Science of Data Modeling. Mathematical Monograph. The “Collegium” All World Academy of Sciences Publishers, Munich (Germany), 2011

[17] Lev Gelimson. Fundamental Science of Data Processing. Mathematical Monograph. The “Collegium” All World Academy of Sciences Publishers, Munich (Germany), 2011

[18] Lev Gelimson. Fundamental Science of Solving General Problems. Mathematical Monograph. The “Collegium” All World Academy of Sciences Publishers, Munich (Germany), 2011

[19] Lev Gelimson. Fundamental Science of Strength Data Unification, Modeling, Analysis, Processing, Approximation, and Estimation (Essential). Strength and Engineering Journal of the “Collegium” All World Academy of Sciences, Munich (Germany), 10 (2010), 3

[20] Lev Gelimson. Fundamental Science of Strength Data Unification, Modeling, Analysis, Processing, Approximation, and Estimation (Fundamentals). Strength Monograph. The “Collegium” All World Academy of Sciences Publishers, Munich (Germany), 2010

[21] Lev Gelimson. Graph-Analytic Theories in Fundamental Sciences of Estimation, Approximation, Data Modeling and Processing (Essential). Mathematical Journal of the “Collegium” All World Academy of Sciences, Munich (Germany), 11 (2011), 2

[22] Lev Gelimson. Principal Graph Types Theories in Fundamental Sciences of Estimation, Approximation, Data Modeling and Processing (Essential). Mathematical Journal of the “Collegium” All World Academy of Sciences, Munich (Germany), 11 (2011), 3

[23] Lev Gelimson. Groupwise Centralization Theories in Fundamental Sciences of Estimation, Approximation, Data Modeling and Processing (Essential). Mechanical and Physical Journal of the “Collegium” All World Academy of Sciences, Munich (Germany), 11 (2011), 1

[24] Lev Gelimson. Data, Problem, Method, and Result Invariance Theories in Fundamental Sciences of Estimation, Approximation, Data Modeling and Processing, and Solving General Problems (Essential). Mathematical Journal of the “Collegium” All World Academy of Sciences, Munich (Germany), 11 (2011), 1