But as data has grown ever larger and more complex, many computer scientists have asked: is the Johnson-Lindenstrauss (JL) lemma, a 30-year-old theorem, really the best approach to pre-process large data into a manageably low dimension for algorithmic processing? (sciencedaily.com)
In a paper presented this week at the annual IEEE Symposium on Foundations of Computer Science in Berkeley, California, Harvard computer scientist Jelani Nelson and co-author Kasper Green Larsen, of Aarhus University in Denmark, found that the JL lemma really is the best way to reduce the dimensionality of data. (sciencedaily.com)
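For readers unfamiliar with the lemma, here is a minimal sketch of the kind of dimensionality reduction it guarantees: projecting n points through a random Gaussian matrix into roughly m = O(ε⁻² log n) dimensions, independent of the original dimension, while approximately preserving all pairwise distances. The constant 8 in the target dimension and the function name jl_project are illustrative choices for this demo, not the tight construction analyzed in Larsen and Nelson's paper.

```python
import numpy as np

def jl_project(points, eps=0.25, seed=0):
    """Reduce (n x d) points to m dimensions in the style of the JL lemma.

    The target dimension m grows like log(n) / eps^2 and does not depend
    on the original dimension d. The constant 8 is a common textbook
    choice, not the optimal bound from the Larsen-Nelson paper.
    """
    n, d = points.shape
    m = int(np.ceil(8 * np.log(n) / eps**2))
    rng = np.random.default_rng(seed)
    # A Gaussian matrix scaled by 1/sqrt(m) preserves squared norms in
    # expectation, and the JL lemma says it preserves them within a
    # (1 +/- eps) factor for all pairs with high probability.
    proj = rng.normal(size=(d, m)) / np.sqrt(m)
    return points @ proj

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    X = rng.normal(size=(50, 10_000))   # 50 points in 10,000 dimensions
    Y = jl_project(X, eps=0.25)
    i, j = 3, 17
    orig = np.linalg.norm(X[i] - X[j])
    low = np.linalg.norm(Y[i] - Y[j])
    print(f"original dim {X.shape[1]}, reduced dim {Y.shape[1]}")
    print(f"distance ratio after projection: {low / orig:.3f}")  # near 1.0
```

Running this sketch compresses 10,000-dimensional points into roughly 500 dimensions while keeping the sampled pairwise distance within the (1 ± ε) band; what Larsen and Nelson proved is that no dimensionality-reduction scheme with the same distance guarantee can do meaningfully better than this target dimension.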