While accuracy is a measure of success, Kosinski said he didn't know whether it was ethically sound to create
the best algorithmic approach, for fear that someone could replicate it; instead, he opted to use off-the-shelf approaches.
Not exact matches
But as data has grown even larger and more complex, many computer scientists have asked: Is the JL lemma really the
best approach to pre-process large data into a manageably low dimension for
algorithmic processing?
A Harvard computer scientist found that the Johnson-Lindenstrauss lemma, a 30-year-old theorem, is the
best approach to pre-process large data into a manageably low dimension for
algorithmic processing.
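The lemma says that projecting high-dimensional points through a suitably scaled random matrix approximately preserves all pairwise distances, so downstream algorithms can run in far fewer dimensions. A minimal sketch of such a random projection follows; the specific constant in the target-dimension bound and the Gaussian projection matrix are one common choice, not the only valid one:

```python
import numpy as np

# Johnson-Lindenstrauss-style random projection (sketch):
# map n points in d dimensions down to k dimensions so that
# pairwise distances are preserved up to a factor of (1 ± eps).
rng = np.random.default_rng(0)

n, d = 50, 1000                            # 50 points in 1000 dimensions
eps = 0.25                                 # allowed distortion
k = int(np.ceil(8 * np.log(n) / eps**2))   # target dimension (one common bound)

X = rng.normal(size=(n, d))                # original high-dimensional data
R = rng.normal(size=(d, k)) / np.sqrt(k)   # scaled random Gaussian projection
Y = X @ R                                  # reduced data: n points in k dims

# Distance between the first two points, before and after projection
orig = np.linalg.norm(X[0] - X[1])
proj = np.linalg.norm(Y[0] - Y[1])
ratio = proj / orig                        # should lie in (1 - eps, 1 + eps)
```

The appeal of the approach is that the projection matrix is data-independent: it can be drawn once, up front, and applied to a stream of points without ever examining the data itself.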
Humans can still modify factors as
well; however, these new
approaches are dynamic, rather than the static stop-and-go of manual
algorithmic manipulation.
And Fastcase's
algorithmic approach to legal research, which doesn't involve tens of thousands of human editors but instead relies on intelligent software and
good access to the law, is built exactly for that.