Locality Preserving Projection by He and Niyogi - Article Example

Summary
This paper "Locality Preserving Projection by He and Niyogi" tells that as technological advances continue to emerge, a few proposals have been proposed regarding manifold learning algorithms like the Laplacian Eigenmap, Locally Linear Embedding, ISOMAP, and Locality Preserving Projection…

Introduction

In the 21st century, as technological advances continue to emerge, several manifold learning algorithms have been proposed, such as the Laplacian Eigenmap (LE), Locally Linear Embedding (LLE), ISOMAP, and Locality Preserving Projections (LPP). The main aim of these algorithms is to discover the meaningful low-dimensional structure of the data space. The aim of this report is to explore and analyse the Locality Preserving Projections method proposed by He, X. and Niyogi, P. (2004) of the University of Chicago in their article "Locality preserving projections". The analysis of the article will be based on its content, novelty (innovation), technical quality, X-factor, quality of presentation, and applications of the research work.

Content

Title

In terms of content and title, the title of the article provides a full overview of what the research article is about. However, the title is broad in the sense that it does not specify the objective. Upon reading the article, the reader realizes that Locality Preserving Projections is a solution to an existing problem in dimensionality reduction. A more appropriate title would have been more straightforward in summarizing the article in one sentence, such as "Locality Preserving Projections for dimensionality reduction". Such a title would be appropriate for this article because of its high-quality work.

Context

The authors introduce Locality Preserving Projections (LPP), defined as linear projective maps that arise from solving a variational problem that optimally preserves the neighbourhood structure of the data set, as an algorithm for linear dimensionality reduction in information processing. The algorithm is introduced as a solution to the problem of dimensionality reduction, where the authors give the example of a collection of $n$-dimensional real vectors (data points) drawn from an unknown probability distribution. According to the authors, LPP in this case builds a graph that incorporates the neighbourhood information of the data set; then, using the notion of the graph Laplacian, one computes a transformation matrix that maps the data points to a subspace. This linear transformation optimally preserves local neighbourhood information in a certain sense. The representation map generated by the algorithm can be seen as a linear discrete approximation, arising naturally from the geometry of the manifold, to a continuous map.

The problem statement

The authors present the problem of linear dimensionality reduction as follows: given a set $\mathbf{x}_1, \mathbf{x}_2, \ldots, \mathbf{x}_m$ in $\mathbb{R}^n$, find a transformation matrix $A$ that maps these $m$ points to points $\mathbf{y}_1, \mathbf{y}_2, \ldots, \mathbf{y}_m$ in $\mathbb{R}^l$ ($l \ll n$), such that $\mathbf{y}_i$ "represents" $\mathbf{x}_i$, where $\mathbf{y}_i = A^T \mathbf{x}_i$. The authors focus on the case of practical applicability in which $\mathbf{x}_1, \ldots, \mathbf{x}_m \in \mathcal{M}$, where $\mathcal{M}$ is a nonlinear manifold embedded in $\mathbb{R}^n$.

Locality Preserving Projections: Implementation

According to the authors, three steps are followed when solving the problem statement with the LPP algorithm. The first step is the construction of the adjacency graph: let $G$ denote a graph with $m$ nodes, and put an edge between nodes $i$ and $j$ if $\mathbf{x}_i$ and $\mathbf{x}_j$ are close. There are two variations: (a) $\epsilon$-neighbourhoods [parameter $\epsilon \in \mathbb{R}$], where nodes $i$ and $j$ are linked by an edge if $\|\mathbf{x}_i - \mathbf{x}_j\|^2 < \epsilon$, the norm being the usual Euclidean norm in $\mathbb{R}^n$; and (b) $k$ nearest neighbours [parameter $k \in \mathbb{N}$], where nodes $i$ and $j$ are linked if $\mathbf{x}_i$ is among the $k$ nearest neighbours of $\mathbf{x}_j$ or $\mathbf{x}_j$ is among the $k$ nearest neighbours of $\mathbf{x}_i$. The second step involves choosing the weights: $W$ is a sparse symmetric $m \times m$ matrix in which $W_{ij}$ is the weight of the edge joining vertices $i$ and $j$, and $0$ if there is no such edge. The two variations in this step are (a) the heat kernel [parameter $t \in \mathbb{R}$]: if nodes $i$ and $j$ are linked, put $W_{ij} = e^{-\|\mathbf{x}_i - \mathbf{x}_j\|^2 / t}$; and (b) simple-minded [no parameter]: $W_{ij} = 1$ if and only if an edge links vertices $i$ and $j$. The third step involves the Eigenmaps: the eigenvalues and eigenvectors are computed for the generalized eigenvector problem $X L X^T \mathbf{a} = \lambda X D X^T \mathbf{a}$, where $D$ is a diagonal matrix with $D_{ii} = \sum_j W_{ji}$, $L = D - W$ is the Laplacian matrix, and the $i$-th column of the matrix $X$ is $\mathbf{x}_i$.

Justification of the LPP algorithm

In order to justify their proposed LPP algorithm, the authors prove the functioning of the algorithm through three methods: Optimal Linear Embedding, Geometrical Justification, and Kernel LPP.

In Optimal Linear Embedding, the authors base their argument on standard spectral graph theory. Given a data set and a weighted graph $G = (V, E)$, the authors consider the problem of mapping the weighted graph to a line so that linked points stay as close to each other as possible. They denote such a map by $\mathbf{y} = (y_1, y_2, \ldots, y_m)^T$. According to them, a reasonable criterion for selecting a good map is to minimize the objective function $\sum_{ij} (y_i - y_j)^2 W_{ij}$ under suitable constraints. With the authors' choice of $W_{ij}$, this objective function incurs a heavy penalty if neighbouring points $\mathbf{x}_i$ and $\mathbf{x}_j$ are mapped far apart. Thus, minimizing the objective function is an attempt to ensure that if $\mathbf{x}_i$ and $\mathbf{x}_j$ are near, then $y_i$ and $y_j$ are near as well. The authors then work through the objective function and show how it reduces to a minimization problem after imposing the constraints.

In the Geometrical Justification, the authors explain that the Laplacian matrix of the finite graph ($L = D - W$), per Chung's (1997) Spectral Graph Theory, is analogous to the Laplace Beltrami operator $\mathcal{L}$ on compact Riemannian manifolds: while the Laplace Beltrami operator of a manifold is produced by the Riemannian metric, for a graph it originates from the adjacency relation. The authors let $\mathcal{M}$ be a smooth, compact, $d$-dimensional Riemannian manifold; if the manifold is embedded in $\mathbb{R}^n$, its Riemannian structure is induced by the standard Riemannian structure of $\mathbb{R}^n$. According to them, they are searching for a map from the manifold to the real line such that points near each other on the manifold get mapped near each other on the line. The authors take $f$ to be such a map and assume that it is twice differentiable. They then quote a worked-out optimization problem from Belkin and Niyogi (2002, 585).

In the Kernel LPP, the authors assume that the Euclidean space $\mathbb{R}^n$ is mapped into a Hilbert space $\mathcal{H}$ through a nonlinear mapping function $\Phi: \mathbb{R}^n \to \mathcal{H}$. Using this function, the kernel function $K(\mathbf{x}_i, \mathbf{x}_j) = \langle \Phi(\mathbf{x}_i), \Phi(\mathbf{x}_j) \rangle$, and simple algebra, they formulate an eigenvector problem in the Hilbert space. The authors then employ Kernel LPP to solve the problem and show that it yields the same results as the Laplacian Eigenmaps.
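For reference, the reduction sketched in the Optimal Linear Embedding justification can be written out explicitly (reconstructed here from the paper's definitions, with the $i$-th column of $X$ being $\mathbf{x}_i$):

$$
\frac{1}{2}\sum_{ij}\,(y_i - y_j)^2\,W_{ij}
\;=\; \frac{1}{2}\sum_{ij}\,\bigl(\mathbf{a}^T\mathbf{x}_i - \mathbf{a}^T\mathbf{x}_j\bigr)^2\,W_{ij}
\;=\; \mathbf{a}^T X L X^T \mathbf{a},
$$

where $L = D - W$ and $D_{ii} = \sum_j W_{ji}$. Imposing the constraint $\mathbf{a}^T X D X^T \mathbf{a} = 1$ and minimizing then yields exactly the generalized eigenvector problem of the third step, $X L X^T \mathbf{a} = \lambda X D X^T \mathbf{a}$, whose minimum-eigenvalue solution gives the optimal projection direction $\mathbf{a}$.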
Data analytics methods

In the course of carrying out their research, the authors employed various data analytics techniques. They used association rule learning when conducting an experiment on 2-dimensional data visualization with the Multiple Features Database (handwritten digits derived from a collection of Dutch utility maps) to examine the relationship between dimensionality reduction algorithms (Laplacian Eigenmaps, LPP, and PCA). The data set of the Multiple Features Database consists of handwritten digits ('0'-'9'), and the mapping of these digits into 2-dimensional space is done using the Laplacian Eigenmaps, LPP, and PCA algorithms. In this experiment the authors also used the clustering method to find the algorithm most suitable for dimensionality reduction, the aim being to show that the LPP algorithm is the best; clustering was also used in mapping the data points into 2-dimensional space.

The authors also used summarization to provide a more compact representation of the data set in the manifold of face images, where they applied LPP to face images for visualization on a data set consisting of 1965 face images. Clustering was again used where the face images were to be displayed next to their data points in the space.

The authors conducted a further experiment on face recognition, using association rule learning, anomaly detection, and clustering methods on the Yale face database, to which they applied the LPP algorithm. Association rule learning was used to identify the relationships between the variables, which include facial expressions, lighting conditions, and the presence or absence of glasses. Clustering was used for cropping the final images, aligning the eyes in a similar position, and selecting the best algorithm. Anomaly detection was used to identify the error rates. To form the training set, six labelled images of each individual were used; this training set, according to the authors, was employed to learn the projections. The testing samples were then projected into the reduced space, and recognition was carried out using a nearest-neighbour classifier.
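To make the three-step procedure and this recognition pipeline concrete, the following is a minimal NumPy/SciPy sketch, not the authors' code: the function name, the default parameter values, and the small ridge regulariser added for numerical stability are our own illustrative choices, and the arrays in the usage example (X_train, y_train, X_test) are hypothetical stand-ins for the Yale data.

```python
import numpy as np
from scipy.linalg import eigh

def lpp(X, n_components=2, k=5, t=1.0):
    """Locality Preserving Projections, following the paper's three steps.

    X : (m, n) array, one data point per row.
    Returns A : (n, n_components) matrix; the embedding is X @ A.
    """
    m, n = X.shape

    # Step 1: adjacency graph via k nearest neighbours, symmetrised so that
    # i ~ j if either point is among the k nearest neighbours of the other.
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=-1)
    knn = np.argsort(d2, axis=1)[:, 1:k + 1]   # column 0 is the point itself
    adj = np.zeros((m, m), dtype=bool)
    adj[np.repeat(np.arange(m), k), knn.ravel()] = True
    adj |= adj.T

    # Step 2: heat-kernel weights W_ij = exp(-||x_i - x_j||^2 / t) on edges.
    W = np.where(adj, np.exp(-d2 / t), 0.0)

    # Step 3: generalized eigenproblem. With row-major data the paper's
    # X L X^T a = lambda X D X^T a becomes (X^T L X) a = lambda (X^T D X) a;
    # the embedding directions are the smallest-eigenvalue solutions.
    D = np.diag(W.sum(axis=1))
    L = D - W
    ridge = 1e-9 * np.eye(n)   # our addition for stability, not in the paper
    vals, vecs = eigh(X.T @ L @ X, X.T @ D @ X + ridge)
    return vecs[:, :n_components]

# Usage sketch mirroring the face recognition experiment: learn projections
# on labelled training faces, project both sets, then classify each test
# face by its nearest neighbour in the reduced space.
# A = lpp(X_train, n_components=30, k=5, t=100.0)
# Z_train, Z_test = X_train @ A, X_test @ A
# nearest = ((Z_test[:, None] - Z_train[None]) ** 2).sum(-1).argmin(axis=1)
# predictions = y_train[nearest]
```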
Experimental results

In this section the authors discuss the different application areas of the LPP algorithm, giving two synthetic examples to illustrate how LPP functions. The first example is simply synthetic: the authors provide figure 1 with four plots, where the first and third plots display the PCA results while the second and fourth show the LPP results. In the authors' view, based on this example, LPP is insensitive to outliers and more powerful in terms of discrimination than PCA. The authors then provide figure 2, which represents the mapping of the handwritten digits ('0'-'9') into 2-dimensional space: the figure on the left represents the Laplacian Eigenmaps of the whole set of digit images, the centre figure displays the LPP results, and the figure on the right displays the PCA results, with each colour corresponding to a digit. In the face-images manifold experiment, the authors provide figure 3, which shows the face images mapped into a 2-dimensional plane. According to the authors, the mapping of the image space to a low-dimensional space using the LPP method is linear, and various representative faces are displayed at different parts of the space next to the corresponding data points. To some extent, the proposed LPP detects the nonlinear manifold structure of the face images: the face images in the space are segmented clearly into two parts, with the right part displaying open-mouth faces and the left part showing closed-mouth faces. According to the authors, this is caused by the LPP algorithm implicitly stressing the natural clusters in the data when attempting to preserve the neighbourhood structure of the embedding. In the face recognition experiment, the authors provide table 1, which displays the face recognition results on the Yale database in terms of error rates. Overall, the performance of LDA, LPP, and PCA varies depending on the number of dimensions; the authors concluded that LPP performs better than both LDA and PCA.

Further research implications

According to the authors, their proposed linear Locality Preserving Projections algorithm for dimensionality reduction is based on the same variational principle that gives rise to the Laplacian Eigenmap; therefore, it has the same locality-preserving properties. According to them, the main benefit of their algorithm over other recent nonparametric methods for global nonlinear dimensionality reduction is that its eigenproblem scales with the dimensionality of the data points rather than with their number (the matrices $X L X^T$ and $X D X^T$ are $n \times n$, whereas the Laplacian Eigenmap problem is posed on the $m \times m$ graph Laplacian). For bulky data sets, the savings in memory and run time can be massive. The method's performance advantage over Principal Component Analysis, according to the authors, is illustrated through various experiments. Even though their method is a linear algorithm, it has the ability to discover the nonlinear structure of the data manifold.

Innovation

Innovation can be explained as the act of introducing something new, or simply the act of innovating. He, X. and Niyogi, P.'s (2004) research paper can be identified as highly innovative because their work proposes a new linear algorithm that can be used for dimensionality reduction: they made use of existing theories and methods to formulate a new and improved method for solving the dimensionality reduction problem. Another reason why their work is highly innovative is that the proposed algorithm is not just another common problem-solving algorithm, but one that can be applied in state-of-the-art technology such as facial recognition, which has been and is still being implemented by many companies, governments, and individuals for security and surveillance. Even though this technology is not new and has been implemented using other algorithms such as PCA and LDA, LPP provides better functionality, detects errors, shares many properties with the Laplacian Eigenmap and other nonlinear techniques for data representation, is more computationally tractable, and outperforms the existing algorithms used in dimensionality reduction. The authors view the LPP algorithm as innovative from several viewpoints: the LPP algorithm is linear and thus suitable and fast for practical applications; it can be applied to any new data point because it is defined everywhere; it can be carried out in the original space or in the reproducing kernel Hilbert space (RKHS) into which the data points are mapped; and, unlike the classical linear techniques, the LPP maps are designed to minimize a different objective criterion.
Because of these features, the authors expect techniques based on LPP to be viewed as an alternative to Principal Component Analysis (PCA) in pattern classification, exploratory data analysis, and information retrieval. The paper is also highly innovative because it has been cited by other books and articles, even if not by many, such as: Zheng, F., Shao, L. and Song, Z., 2010, July. Eigen-space learning using semi-supervised diffusion maps for human action recognition. In Proceedings of the ACM International Conference on Image and Video Retrieval (pp. 151-157). ACM; Chen, Z., 2012. Discovery of informative and predictive patterns in dynamic networks of complex systems. North Carolina State University; Assi, K.C., Labelle, H. and Cheriet, F., 2014. Statistical model based 3D shape prediction of postoperative trunks for non-invasive scoliosis surgery planning. Computers in Biology and Medicine, 48, pp. 85-93; Assi, K.C., Labelle, H. and Cheriet, F., 2014. Modified large margin nearest neighbor metric learning for regression. IEEE Signal Processing Letters, 21(3), pp. 292-296; Foytik, J.D., 2011. Locally Tuned Nonlinear Manifold for Person Independent Head Pose Estimation (Doctoral dissertation, University of Dayton); Min, R., 2013. Reconnaissance de visage robuste aux occultations (Doctoral dissertation, Paris, ENST); and Rui, M.I.N., 2013. Doctorat ParisTech (Doctoral dissertation, TELECOM ParisTech). The fact that all the above authors chose He, X. and Niyogi, P. (2004) as a reference for their work indicates that they considered it highly innovative and of high quality. This is reinforced by the fact that some of the articles are peer reviewed and published by international professional bodies such as the IEEE and ACM, while others have been published as doctoral dissertations.

Technical Quality

The work done in this paper by He, X. and Niyogi, P. (2004) is of high quality. The authors used existing dimensionality reduction methods to create the new LPP algorithm. They proposed a problem statement and, through a step-by-step procedure, solved the problem of dimensionality reduction using the proposed algorithm. Beyond that, the authors justified that their algorithm is functional through Optimal Linear Embedding, Geometrical Justification, and Kernel LPP, in each case presenting a dimensionality reduction problem and providing step-by-step guidance on how to solve it. He, X. and Niyogi, P. (2004) used explained equations throughout their work, showing their workings. This transparency is a main attribute of high-quality work: an independent researcher working in the same field of information processing can replicate any section of this paper they wish, as long as they understand basic mathematical symbols, equations, and operators and have the appropriate software to do so. When conducting their research, He, X. and Niyogi, P. (2004) obtained their data from primary and secondary sources of information. As primary sources, the authors used raw facts and figures, formulating their own objective functions and gathering their own data for the experiments.
As secondary sources, the authors referenced eight peer-reviewed articles and publications, selected on the criterion of their usefulness and applicability to the research. Referencing the work from various sources of information increases the quality of the paper.

In any research paper, the quality of work can also be gauged by the number of experiments carried out before arriving at a conclusion; high-quality research conducts more than one experiment for verification purposes in order to arrive at an acceptable conclusion. In their work, He, X. and Niyogi, P. (2004) conducted several experiments: the face recognition experiment, where they applied the LPP, LDA, and PCA techniques to facial recognition; the face-images manifold experiment, where they applied LPP to face images; and the 2-dimensional visualization experiment employing the Multiple Features Database. These experiments helped the authors arrive at an acceptable conclusion, namely that LPP outperforms the other techniques.

X-factor

The paper's X-factor can be explained in terms of its appealing results and equations, and the way the authors have solved the defined problem statement. In terms of the appeal of the results, any individual who views the report, whether an expert in the field or not, is drawn to the results section, where the results are presented in appealing figures: figure 1 (the simply synthetic example), figure 2 (2-D data visualization), and figure 3 (2-D visualization of the face manifold). Even though figures 1 and 2 are not as large as figure 3, their colourful appearance attracts an individual's attention and invites them to find out what the figures are about. The most captivating figure is figure 3, with its human faces: it makes a reader eager to know why there are so many faces, similar yet different, in one figure. The many equations dominating the first pages of the paper also appeal to individuals who have dealt with linear and nonlinear equations, piquing their curiosity about what the equations are for and which problem statement is being solved with them. The ability of the paper to appeal to any individual regardless of technical background, thanks to its visual presentation, indicates that it has an X-factor.

In terms of addressing the problem statement, the paper is interesting and could start a discussion in class, because the authors explain the entire procedure followed in solving the problem, from the formulation of the objective function using the LPP algorithm through to its solution. The authors also justify the LPP algorithm using worked-out examples. This can engage students in a creative discussion where they try to work out the problem statement using the paper's step-by-step equations and solutions as a guide; in the end, students can gain knowledge of ways of formulating and minimizing objective functions. This qualifies as an X-factor.

Presentation

The paper's presentation is of a high standard in terms of depth of argument, clarity, and style. In the depth of their arguments, the authors propose the new LPP algorithm and, in order to justify its applicability and functionality, present a problem statement which they solve using the new algorithm.
They provide a detailed description of how to apply the LPP algorithm to the problem statement and formulate an objective function, which they then solve. The authors also provide three methods of justifying their algorithm before conducting experiments to demonstrate its practical application. As for clarity, the way the authors describe and show the procedures to be followed when formulating an objective function from the problem statement gives the reader, or other researchers, enough clarity that an individual with a technical background can replicate the work, thanks to its transparency. The application of the LPP algorithm in the experiments also gives readers clarity regarding its practical applicability and validity in pattern classification, information retrieval, and exploratory data analysis. The verbal presentation style is likewise easy to understand: where the authors used technical terms that required an explanation, they simply provided a reference for further reading, while other technical terms regarding the LPP algorithm were explained directly. Where technical jargon could be avoided, the authors used general language, which made the paper's language concise and clear. In the visual presentation of figures, the authors provided detailed explanations of the figures, referenced them, and annotated them appropriately.

Suggested improvements

No research work, however high the standard of its presentation, is necessarily perfect, and He, X. and Niyogi, P.'s (2004) paper was not always easy to follow. The suggested improvements include the following.

When writing a research report, it is usually important to explain what every section entails by providing a brief description; in this paper, the authors did not provide such a description after some section titles, which makes the report hard to follow for an individual without an understanding of the proposed algorithm. For example, in the Justification section, the authors did not provide a brief description of what the section entails.

Another suggestion concerns the arrangement of the overall report. There is a standard outline that research reports follow, in which the author guides the reader through tasks and activities in chronological order. In their report, the authors should not have listed the different perspectives on why the new LPP algorithm is interesting in the introduction, because at that point the reader has not yet fully comprehended what the algorithm entails, how it functions, and how it outperforms other dimensionality reduction techniques; this part should come just before the conclusion, after the experimental results.

Finally, the authors did not separate the experiments and experimental results, or arrange them appropriately: the results are explained before the experiments section, and in some places the experiments and results are combined. The authors should have separated the experiments section from the results section, with the experiments appearing before the experimental results.

Application

The research work of this paper is not limited in its future applications.
This is because the paper's main aim was to propose a new LPP algorithm that could be applied in dimensionality reduction. As technological advances continue, huge amounts of data will continue to accumulate. Bearing in mind that information is power and that data is becoming a major trading commodity, with organizations dealing in the selling of data, new methods for data and information processing will be required; the LPP algorithm can therefore be used by data companies for information processing. Another domain where the research work can be applied is marketing, where the LPP algorithm can be used for pattern recognition: here pattern recognition refers to consumer preferences, where the algorithm identifies a consumption pattern from the products bought by a certain customer, and the pattern is then used to predict which products can be marketed to that consumer. The LPP algorithm can also be applied in the design of facial recognition software for security purposes, where it uses pattern recognition to identify individuals, as in the case where a government puts the profiles of wanted criminals in an online database connected to CCTV surveillance: if the CCTV cameras capture a criminal's face, the LPP algorithm integrated into the police database automatically detects the individual from his facial patterns and sounds an alert. The LPP algorithm can also be applied in the design of modern library and archive management systems, where, once implemented, the system allows an individual to retrieve information (documents, video, and audio) through an indexing scheme that enables quick retrieval, since the LPP design preserves local structure.

Conclusion

In general, the article by He, X. and Niyogi, P. (2004) qualifies as being of a high standard. The article proposed an interesting new linear LPP algorithm for dimensionality reduction in information processing. For this reason the paper was considered highly innovative, well presented verbally, and well proven. The information presented in the article is helpful, and individuals researching dimensionality reduction algorithms can be recommended to seek out this article for referencing purposes.

References

Assi, K.C., Labelle, H. and Cheriet, F., 2014. Modified large margin nearest neighbor metric learning for regression. IEEE Signal Processing Letters, 21(3), pp. 292-296.

Assi, K.C., Labelle, H. and Cheriet, F., 2014. Statistical model based 3D shape prediction of postoperative trunks for non-invasive scoliosis surgery planning. Computers in Biology and Medicine, 48, pp. 85-93.

Belkin, M. and Niyogi, P., 2001, December. Laplacian eigenmaps and spectral techniques for embedding and clustering. In NIPS (Vol. 14, pp. 585-591).

Chen, Z., 2012. Discovery of informative and predictive patterns in dynamic networks of complex systems. North Carolina State University.

Foytik, J.D., 2011. Locally Tuned Nonlinear Manifold for Person Independent Head Pose Estimation (Doctoral dissertation, University of Dayton).

He, X. and Niyogi, P., 2004. Locality preserving projections. In Advances in Neural Information Processing Systems 16. Cambridge, MA: MIT Press.

Min, R., 2013. Reconnaissance de visage robuste aux occultations (Doctoral dissertation, Paris, ENST).

Rui, M.I.N., 2013. Doctorat ParisTech (Doctoral dissertation, TELECOM ParisTech).

Zheng, F., Shao, L. and Song, Z., 2010, July. Eigen-space learning using semi-supervised diffusion maps for human action recognition. In Proceedings of the ACM International Conference on Image and Video Retrieval (pp. 151-157). ACM.