Concatenation of Orthogonal Analytics and Geometrical Projections
Received: 01-Sep-2021; Accepted: 08-Sep-2021; Published: 13-Sep-2021
Citation: Rassias MT. Concatenation of orthogonal analytics and geometrical projections. J Pur Appl Math. 2021;5(5):57.
This open-access article is distributed under the terms of the Creative Commons Attribution Non-Commercial License (CC BY-NC) (http://creativecommons.org/licenses/by-nc/4.0/), which permits reuse, distribution and reproduction of the article, provided that the original work is properly cited and the reuse is restricted to noncommercial purposes. For commercial reuse, contact reprints@pulsus.com
Abstract
Projections are an important class of linear transformations (besides rotations and reflections) and play a central role in graphics, coding theory, statistics and machine learning. In machine learning we often deal with high-dimensional data, which is hard to analyze or visualize.
Introduction
In statistics, the meaning of orthogonal as unrelated (or, more precisely, uncorrelated) is very directly related to the mathematical definition. High-dimensional data quite often possesses the property that only a few dimensions contain most of the information, and most other dimensions are not essential to describing the key properties of the data. When we compress or visualize high-dimensional data, we lose information. To minimize this compression loss, we ideally find the most informative dimensions in the data. "Feature" is a common expression for a data representation, and data can be represented as vectors. More specifically, we can project the original high-dimensional data onto a lower-dimensional feature space and work in this lower-dimensional space to learn more about the dataset and extract relevant patterns.

The projections π_U(x) are still vectors in R^n, although they lie in an m-dimensional subspace U ⊆ R^n. However, to represent a projected vector we only need the m coordinates λ_1, ..., λ_m with respect to the basis vectors b_1, ..., b_m of U. In vector spaces with general inner products, we have to pay attention when computing angles and distances, because both are defined by means of the inner product.

Projections also let us find approximate solutions to unsolvable linear equation systems. Consider a linear system Ax = b without a solution. Recall that this means that b does not lie in the span of the columns of A, i.e., b does not lie in the subspace spanned by the columns of A. Given that the linear equation cannot be solved exactly, we can find an approximate solution by projecting, using exactly the machinery sketched below.
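As a concrete illustration of the coordinate representation described above, the following minimal NumPy sketch projects a vector x in R^3 onto a two-dimensional subspace U spanned by basis vectors b_1, b_2. The particular matrix B and vector x are assumed purely for this sketch; they do not come from the article.

```python
import numpy as np

# Basis of a 2-dimensional subspace U of R^3, stored as the columns of B.
# (The particular numbers are illustrative, not taken from the article.)
B = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [0.0, 2.0]])

x = np.array([6.0, 0.0, 0.0])  # vector in R^3 to be projected onto U

# Coordinates of the projection with respect to the basis b_1, b_2:
# lambda = (B^T B)^{-1} B^T x  (normal equations for the projection)
lam = np.linalg.solve(B.T @ B, B.T @ x)

# The projection pi_U(x) is still a vector in R^3, but it is fully
# described by the m = 2 coordinates stored in `lam`.
pi_x = B @ lam

print(lam)               # coordinates lambda_1, lambda_2
print(pi_x)              # projected vector in R^3
print(B.T @ (x - pi_x))  # residual is orthogonal to U: approximately [0, 0]
```

The residual x − π_U(x) being orthogonal to both basis vectors is the defining property of an orthogonal projection, and it is the same property exploited in the least-squares discussion that follows.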
The idea is to find the vector in the subspace spanned by the columns of A that is closest to b, i.e., to compute the orthogonal projection of b onto the subspace spanned by the columns of A. This problem arises often in practice, and the solution is called the least-squares solution (assuming the dot product as the inner product) of an overdetermined system.
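To make that connection concrete, here is a minimal sketch with an invented overdetermined system (assumed only for illustration) that computes the least-squares solution both from the normal equations, i.e. as the orthogonal projection of b onto the column space of A, and with NumPy's built-in least-squares routine.

```python
import numpy as np

# Overdetermined system Ax = b with no exact solution
# (illustrative numbers, assumed for this sketch).
A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0]])
b = np.array([6.0, 0.0, 0.0])

# Least-squares solution from the normal equations A^T A x = A^T b,
# i.e. the coordinates of the orthogonal projection of b onto col(A).
x_ls = np.linalg.solve(A.T @ A, A.T @ b)

# The same answer via NumPy's least-squares routine.
x_np, *_ = np.linalg.lstsq(A, b, rcond=None)

# Projection of b onto the column space of A; the residual b - A x_ls
# is orthogonal to the columns of A.
b_proj = A @ x_ls
print(x_ls, x_np)          # identical least-squares solutions
print(A.T @ (b - b_proj))  # approximately [0, 0]
```

Both routes give the same x, and the residual b − Ax is orthogonal to every column of A, which is precisely the geometric statement that Ax is the orthogonal projection of b onto the span of the columns.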
In chromatography, orthogonality refers to the lack of correlation between separation mechanisms. Two-dimensional (2D) separations are important for addressing one of the major concerns in method development, insufficient resolution, which can mask co-eluting peaks. Additionally, orthogonal contrasts offer an alternative way of carrying out statistical analysis on data from experiments without a definitive form, such as designs with incremental treatment levels.
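As a hedged illustration of the orthogonal-contrast idea mentioned above, the sketch below builds the standard orthogonal polynomial contrasts for four equally spaced (incremental) treatment levels and verifies their orthogonality; the group means are invented purely for demonstration.

```python
import numpy as np

# Orthogonal (polynomial) contrasts for four equally spaced treatment
# levels: linear, quadratic and cubic trend.  These are the standard
# orthogonal-polynomial coefficients for k = 4 groups.
contrasts = np.array([
    [-3, -1,  1,  3],   # linear
    [ 1, -1, -1,  1],   # quadratic
    [-1,  3, -3,  1],   # cubic
], dtype=float)

# Each contrast sums to zero, and any two contrasts have a zero dot
# product, so they partition the treatment variation into independent,
# uncorrelated components.
print(contrasts.sum(axis=1))    # [0. 0. 0.]
print(contrasts @ contrasts.T)  # off-diagonal entries are all zero

# Illustrative treatment means (invented for this sketch); each contrast
# estimate is simply the inner product of its coefficients with the means.
group_means = np.array([4.1, 5.0, 6.2, 6.9])
print(contrasts @ group_means)  # linear, quadratic, cubic estimates
```

Because the contrast vectors are mutually orthogonal, the resulting linear, quadratic and cubic estimates are uncorrelated, mirroring the statistical meaning of orthogonality discussed in the introduction.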