Truth Tide TV UNSEALED · 1419 Files · 74547 Email Threads

Re: MDF

2 messages · Source PDF
Jeffrey Epstein Oct 22, 2013 4:01 PM

I would add the possibility that each differentiated input has its own encrypted algorithm, and that looking at it from too high an altitude provides little info about each one; i.e., optic nerve encryption is different from that of the nasal receptors. Maybe even a one-time code that allows only the individual to access certain stored info.

Joscha Bach Oct 23, 2013 2:41 PM
To: Jeffrey Epstein
Cc: Joi Ito, Kevin Slavin, An Gesher, Takashi Ikegami, Martin Nowak, Greg Borenstein

Indeed! Each individual will form its own code, for each modality. On the other hand, these codes do not simply diverge, but they are the result of the individual's adaptation to its own (changing, developing, deteriorating) physiology. The nervous system is designed to extract structure based on the statistical properties of the input, and to compensate for defects. For instance, replacing the fine-grained input provided by the many receptors of the cochlea with a crude implant (today's models sample only a handful of frequencies) will usually result in a subjective experience of continuous auditory perception; splicing the data of a few pixels into the optic nerve of a blind person may allocate those pixels their correct positions within the visual field. An interesting question: what are the limits of the plasticity of the sensory modalities? For instance, could we switch modalities to some extent?
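The cochlear-implant point above can be made concrete with a minimal sketch: a dense spectrum standing in for the cochlea's fine-grained receptor array is collapsed into a handful of band averages, the kind of crude sampling the email describes. The receptor count and channel count here are illustrative assumptions, not measurements.

```python
# Hypothetical sketch: the information reduction of a coarse cochlear implant.
# The cochlea's dense receptor array is modeled as a long magnitude spectrum;
# the implant collapses it into a handful of per-band averages.
import numpy as np

def implant_channels(spectrum: np.ndarray, n_channels: int = 8) -> np.ndarray:
    """Collapse a fine-grained magnitude spectrum into n_channels band means."""
    bands = np.array_split(spectrum, n_channels)
    return np.array([band.mean() for band in bands])

# Stand-in for dense cochlear input (~3500 inner hair cells is an assumption).
fine = np.abs(np.random.default_rng(0).normal(size=3500))
coarse = implant_channels(fine, n_channels=8)
print(coarse.shape)  # (8,)
```

Despite this drastic compression, the email's point is that the nervous system adapts to the statistics of whatever coarse input it receives, yielding a subjectively continuous percept.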

More than a hundred years ago, Stratton did a famous experiment in which he wore glasses that turned the world upside down (using prisms). After a few days, his brain adapted and he would perceive everything as being upright again. An experiment that I would like to see one day (and I am not aware whether someone has already tried it): equip a subject with an augmented reality display, for instance Google Glass, and continuously feed a visual depiction of auditory input into a corner of the display. The input should transform the result of a filtered Fourier analysis of the sounds around the subject into regular colors and patterns that can easily be discerned visually. At the same time, plug the ears of the subject (for instance, with noise canceling earplugs and white noise). With a little training, subjects should be able to read typical patterns (for instance, many phonemes) consciously from their sound overlay. But after a few weeks: Could a portion of the visual cortex adapt to the statistical properties of the sound overlay so completely that the subject could literally perceive sounds via their eyes? Could we see music? Could we make use of induced synesthesia to partially replace a lost modality?
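The overlay pipeline proposed above (filtered Fourier analysis of ambient sound, rendered as regular colors and patterns) could be sketched roughly as follows. Everything here is an illustrative assumption: the function names, the patch count, and the low/mid/high-frequency-to-RGB mapping are not from any real AR display API.

```python
# Hypothetical sketch of the sound-to-vision overlay described in the email:
# one audio frame -> windowed Fourier spectrum -> a grid of RGB color patches
# that an AR display could render in a corner of the visual field.
import numpy as np

def audio_frame_to_colors(frame: np.ndarray, n_patches: int = 12) -> np.ndarray:
    """Map one audio frame to n_patches RGB patches in [0, 1].

    Each patch gets three adjacent sub-bands of the spectrum as its R, G, B
    channels, so recurring sound patterns (e.g. phonemes) should produce
    visually discriminable color signatures.
    """
    # Windowed magnitude spectrum ("filtered Fourier analysis").
    spectrum = np.abs(np.fft.rfft(frame * np.hanning(len(frame))))
    # Split into 3 sub-bands per patch and average each band's energy.
    bands = np.array_split(spectrum, n_patches * 3)
    energies = np.array([b.mean() for b in bands]).reshape(n_patches, 3)
    # Normalize to a displayable [0, 1] RGB range.
    peak = energies.max()
    return energies / peak if peak > 0 else energies

frame = np.random.default_rng(1).normal(size=1024)  # stand-in for a mic frame
patches = audio_frame_to_colors(frame)
print(patches.shape)  # (12, 3)
```

Run continuously over microphone frames, such a mapping would give the visual cortex a stable, structured transform of the auditory scene, which is exactly the precondition the email identifies for cross-modal adaptation.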

Cheers,

Joscha

1419 files from the DOJ Epstein case media release. All files are public records from justice.gov.

Built by Truth Tide TV