AnGenMap

Archived Post

From listmaster@animalgenome.org  Sat Aug  3 21:51:28 2019
Return-Path: <listmaster@animalgenome.org>
From: "Walsh, Bruce - (jbwalsh)" <jbwalsh@email.arizona.edu>
Subject: Re: Guide on deep learning
Postmaster: submission approved by list moderator
To: Members of AnGenMap <angenmap@animalgenome.org>
Date: Sat, 03 Aug 2019 21:51:28 -0500

Dan:

You are not thinking DEEP enough.  There are only 9 types of graphs, so
you can tweet 9, 3, 2, 4, and 5, and hence convey to the reader the entire
structure of the data and the analysis in just five characters (more if you
want to add commas and "and").


Bruce Walsh
Professor, Ecology and Evolutionary Biology
Professor, Public Health
Professor, BIO5 Institute
Professor, Plant Sciences
Adjunct Professor, Animal and Comparative Biomedical Sciences
Adjunct Professor, Molecular and Cellular Biology
Member, Graduate Committees on Applied Math, Insect Sciences, Genetics,
Statistics
University of Arizona

Evolution and Selection of Quantitative Traits (Oxford 2018)
https://global.oup.com/...0198830870
https://www.amazon.com/...p/product/0198830874

Genetics and Analysis of Quantitative Traits (Sinauer <Oxford> 1998)
https://global.oup.com/...0878934812
https://www.amazon.com/...0878934812

Google Scholar
https://scholar.google.com/...l=en&oi=ao

________________________________
From: DANIEL GIANOLA <gianola@ansci.wisc.edu>
Sent: Friday, August 2, 2019 3:05 PM
To: Members of AnGenMap <angenmap@animalgenome.org>
Subject: Re: Guide on deep learning

Joanna,

You asked

"BTW Does anyone read whole papers anymore?"

Yes. In the "Journal of DEEP tweeting". It is great: people go to meetings
and tweet stuff. Then a CONVOLUTIONAL DEEP NETWORK classifies the tweet as
"boring","fair", "wow". Graphs are encouraged, but tweets longer than ten
words are discouraged, because 95% of the reviewers, whose tweeted reports
are fed into the DEEP network, have attention deficit disorder. It has been
reported that they tweet while reviewing tweets submitted to the journal.

If you pay $3000, the accepted tweet is an open access tweet. For an
additional $3000, a drone-based system will correct dyslexia and conjugation
issues in tweets such as "Me Tarzan, you Jane. Chita stop tweeting!"

Instructions online include a manual: "The Twitter King", Non-Singularity
University Press. The author is anonymous but owns a building called "The
Tweet Tower". The TWEETography is done automatically, which is great.

Dan
Tweetholacomoteva
__________________________________________________________


On Thu, 1 Aug 2019 at 20:46, Ignacy Misztal <ignacy@uga.edu> wrote:

> Very funny but relevant message by Daniel. Is SHALLOW DEEPly out of
> fashion?
>
> Miguel Perez-Enciso visited UGA a couple of weeks ago and gave a short
> course on machine learning. We had many discussions, and although Miguel is
> capable of sophisticated theories, he adapts to new (good or bad) times.
>
> Some 25+ years ago, animal breeding was full of sophisticated theories
> with long derivations. Write a model, derive derivatives, estimate
> parameters... It took a guru to derive anything, but there were many gurus
> at that time. E.g., see the formulas for threshold models (Gianola, Foulley,
> Hoeschele, ...) or for AI REML (Thompson, Jensen, ...). Brrrrrr
>
> Next came models based on sampling (sometimes termed Bayesian, although BLUP
> is Bayesian too), strongly advocated by Daniel Gianola. Write a model but
> sample instead of deriving much. What used to need pages of derivations could
> then be described in a few sampling formulas. Combinations of
> threshold/linear/censored models are now trivial to implement! Except that
> the costs are high, the results need interpretation, and the methodology is
> not really suitable for large evaluations. But it is good for papers and
> requires less class time for students.
>
> The next step is machine learning. Do not even formulate a model; let the
> machine do it from a list of cryptic choices. Would you trust machine
> learning for commercial decisions in animal breeding? Not sure, but DEEP is
> good for papers and requires even less class time. Also, DEEP is good when
> you need results (of whatever quality) in seconds.
>
> Recently, US public television (PBS) produced a program, "Nova:
> Prediction by the Numbers", that cited Fisher as the originator of
> statistical significance and also commented on Bayes' theorem. Viewers rated
> the program 2 out of 5. Complexity seems to be out of fashion nowadays.
>
> Ignacy Misztal
>
>> -----Original Message-----
>> From: DANIEL GIANOLA <gianola@ansci.wisc.edu>
>> Sent: Monday, July 22, 2019 12:11 PM
>> To: Members of AnGenMap <angenmap@animalgenome.org>
>> Subject: Re: Guide on deep learning
>>
>> A DEEP THANK YOU, Miguel and Laura. Hopefully, we will all learn DEEPly,
>> and see if knowledge of the DEEP architecture of complex traits can ever
>> be learned.
>>
>> We certainly cannot predict with the SHALLOW y = Xb + e (GWAS), and it is
>> argued that if the whole genome is implicated, then GWAS is not informative
>> (Tam et al. 2019, Nat. Rev. Genetics). That line of reasoning implies that
>> perhaps we cannot learn much with the less shallow SHALLOW y = Xb + Zu + e
>> either (all loci get implicated), although predict we certainly can!
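>>
>> A minimal sketch of that contrast on simulated data (the marker coding,
>> the ridge penalty lam, and all simulation settings below are arbitrary
>> choices for illustration, not anyone's actual pipeline):
>>
>>   import numpy as np
>>
>>   rng = np.random.default_rng(1)
>>   n, p = 200, 1000                                # individuals, markers
>>   Z = rng.binomial(2, 0.5, (n, p)).astype(float)  # genotypes coded 0/1/2
>>   u = rng.normal(0.0, 0.05, p)                    # many tiny true effects
>>   y = Z @ u + rng.normal(0.0, 1.0, n)             # polygenic phenotype
>>
>>   Zc = Z - Z.mean(axis=0)                         # center each marker
>>   yc = y - y.mean()
>>
>>   # SHALLOW y = Xb + e: one marker at a time (GWAS-style regression)
>>   b_hat = (Zc * yc[:, None]).sum(axis=0) / (Zc ** 2).sum(axis=0)
>>
>>   # less SHALLOW y = Xb + Zu + e: ridge (GBLUP-like) shrinkage of all loci
>>   lam = 10.0
>>   u_hat = np.linalg.solve(Zc.T @ Zc + lam * np.eye(p), Zc.T @ yc)
>>
>>   # the best single marker explains little of y; the whole-genome model,
>>   # with every locus "implicated", tracks y far more closely (in sample)
>>   j = np.abs(b_hat).argmax()
>>   print(np.corrcoef(Zc[:, j], y)[0, 1])
>>   print(np.corrcoef(Zc @ u_hat, y)[0, 1])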
>>
>> Maybe DEEP learning will implicate the whole genome, the whole
>> interactome, and the increasing army of DEEP phenotypes (images, sounds,
>> odors--good and bad ones--WhatsApp, etc.), and eventually a DEEP MULTI-OMICS
>> SINGLE STAGE BLUP will become a focal topic in the next five years. Perhaps
>> Ignacy Misztal will extend APY to DEEPY and Cheng-Fernando-Garrick will
>> develop DEEP Bayes Cpi-pipi-pipipi, etc. Here pi means the first layer, pipi
>> means the second layer, and pipipi stands for higher-dimensional stuff.
>>
>> Another possibility is that we will drown in a DEEP sea of confusion,
>> akin to the situation of choosing a beer in a good Belgian brasserie or
>> extracting a meaningful message from the UN General Assembly.
>>
>>
>> Regards,
>>
>> Daniel
>>
>> ________________________________
>> From: miguel.perez <miguel.perez@uab.es>
>> Sent: Monday, July 22, 2019 8:02 AM
>> To: Members of AnGenMap <angenmap@animalgenome.org>
>> Subject: Guide on deep learning
>>
>> Just in case you are wondering how deep learning works, we have
>> posted an easy-to-follow script in a GitHub guide
>> https://github.com/...rezenciso/DLpipeline
>> and an associated, somewhat more technical reference in Genes, 10, 553.
>>
>> https://www.mdpi.com/2073-4425/10/7/553
>>
>> --
>> ===============================================================
>> Miguel Perez-Enciso
>> ICREA professor
>> Centre for Research in Agricultural Genomics (CRAG) and Facultat de
>> Veterinaria
>> UAB Campus Universitat Autonoma Barcelona Bellaterra
>> E-08193 Spain
>> Tel: +34 935636600 ext 3346
>> Fax: +34 935636601
>> miguel.perez@uab.es
>> http://www.icrea.cat/...Enciso-255
>> http://bioinformatics.cragenomica.es/numgenomics/
>> http://scholar.google.es/...r=Lpl_-dcAAAAJ&hl=es
>> http://orcid.org/0000-0003-3524-995X
>> https://github.com/miguelperezenciso/
>> ================================================================


 

 
