
Introduction to machine and deep learning for medical physicists

dc.contributor.author: Cui, Sunan
dc.contributor.author: Tseng, Huan‐hsin
dc.contributor.author: Pakela, Julia
dc.contributor.author: Ten Haken, Randall K.
dc.contributor.author: El Naqa, Issam
dc.date.accessioned: 2020-06-03T15:22:31Z
dc.date.available: WITHHELD_13_MONTHS
dc.date.available: 2020-06-03T15:22:31Z
dc.date.issued: 2020-06
dc.identifier.citation: Cui, Sunan; Tseng, Huan‐hsin; Pakela, Julia; Ten Haken, Randall K.; El Naqa, Issam (2020). "Introduction to machine and deep learning for medical physicists." Medical Physics 47(5): e127-e147.
dc.identifier.issn: 0094-2405
dc.identifier.issn: 2473-4209
dc.identifier.uri: https://hdl.handle.net/2027.42/155469
dc.publisher: The MIT Press
dc.publisher: Wiley Periodicals, Inc.
dc.subject.other: deep learning
dc.subject.other: medical physics
dc.subject.other: machine learning
dc.title: Introduction to machine and deep learning for medical physicists
dc.type: Article
dc.rights.robots: IndexNoFollow
dc.subject.hlbsecondlevel: Medicine (General)
dc.subject.hlbtoplevel: Health Sciences
dc.description.peerreviewed: Peer Reviewed
dc.description.bitstreamurl: https://deepblue.lib.umich.edu/bitstream/2027.42/155469/1/mp14140_am.pdf
dc.description.bitstreamurl: https://deepblue.lib.umich.edu/bitstream/2027.42/155469/2/mp14140.pdf
dc.identifier.doi: 10.1002/mp.14140
dc.identifier.source: Medical Physics
dc.owningcollname: Interdisciplinary and Peer-Reviewed


