// Calculating the effect size of a two-way repeated-measures ANOVA in SPSS // An ANOVA compares means between groups. It can also do so for the same group across time points, which is a repeated-measures ANOVA. If two factors influence a dependent variable, a two-way analysis of variance is computed. Effect sizes indicate how strong statistically significant effects are. They are calculated here as r, a kind of correlation coefficient. To interpret the effect, the thresholds from Cohen, J. (1988), Statistical Power Analysis for the Behavioral Sciences, pp. 79-80, are used: a small effect starts at 0.1, a medium effect at 0.3, and a large effect at 0.5. For context: the ANOVA tests whether the dependent variable differs significantly between the time points (null hypothesis: no difference). Unlike the t-test, an ANOVA can do this for more than two time points without the probability of committing a Type I error rising above the alpha level (e.g., 5%). With n tests, this probability rises to 1 − (0.95)^n. If pairwise comparisons were run as t-tests across ten groups, the probability of committing a Type I error would already be about 40%. An ANOVA avoids this problem and is therefore preferred for comparing the means of more than two groups. For questions and comments on the two-way repeated-measures ANOVA in SPSS, please use the comment section. Let me know whether you found the video helpful with a thumbs up or down. #statistikampc To support the channel, do your Amazon shopping via my affiliate link: http://amzn.to/2iBFeG9
Views: 328 Statistik am PC
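The two formulas quoted in the description above, the familywise error rate 1 − (0.95)^n and Cohen's thresholds for r, can be sketched in a few lines of Python (illustrative only; the video itself works in SPSS):

```python
def familywise_error(n_tests, alpha=0.05):
    """Probability of at least one Type I error across n independent tests."""
    return 1 - (1 - alpha) ** n_tests

def classify_r(r):
    """Label an effect size r using Cohen's (1988) conventions (0.1 / 0.3 / 0.5)."""
    r = abs(r)
    if r >= 0.5:
        return "large"
    if r >= 0.3:
        return "medium"
    if r >= 0.1:
        return "small"
    return "negligible"

# Ten pairwise t-tests instead of one ANOVA: the familywise error
# is already about 40 %, matching the figure in the description.
print(round(familywise_error(10), 2))  # → 0.4
print(classify_r(0.35))                # → medium
```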
Learn the basic concepts of power and sample size calculations. With definitions of alpha levels, statistical power, and effect size, a brief look at Stata's interface, and strategies for increasing statistical power, this video is a useful introduction to all subsequent power and sample size videos on the Stata Youtube Channel. Created using Stata 13; new features available in Stata 14. Copyright 2011-2017 StataCorp LLC. All rights reserved.
Views: 60856 StataCorp LLC
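As a rough companion to the video, the core sample-size calculation can be approximated outside Stata. The sketch below uses the normal approximation for a two-sample comparison of means; Stata's `power` command applies a small additional correction for the t distribution, so its answers can differ by a participant or two.

```python
import math
from statistics import NormalDist

def n_per_group(d, alpha=0.05, power=0.80):
    """Approximate per-group sample size for detecting a standardized
    mean difference d in a two-sample test (normal approximation)."""
    z = NormalDist()
    z_a = z.inv_cdf(1 - alpha / 2)  # two-sided critical value
    z_b = z.inv_cdf(power)          # quantile for the desired power
    return math.ceil(2 * ((z_a + z_b) / d) ** 2)

# A medium effect (d = 0.5) at alpha = 0.05 and 80 % power:
print(n_per_group(0.5))  # → 63 per group under this approximation
```

Shrinking the detectable effect size or raising the desired power drives the required n up quickly, which is the trade-off such calculations make visible.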
My analysis of Gould's review of The Bell Curve. Considering that the cover of the 2nd edition of Gould's book The Mismeasure of Man, in which this review is reprinted, is tagged with “The definitive refutation of the argument of The Bell Curve”, and given the widespread currency of this review in “debunking” The Bell Curve, an in-depth analysis is warranted. TL;DW: It completely fails. Follow me on BitChute: https://www.bitchute.com/channel/ro0A2kpjRPD2/ Links: My video on Gould and the g-factor: https://youtu.be/gbKM_BNpzwc Gould's review of TBC: https://www.dartmouth.edu/%7Echance/course/topics/curveball.html The Bell Curve: https://www.amazon.com/Bell-Curve-Intelligence-Structure-Paperbacks/dp/0684824299 The Mismeasure of Man: https://www.amazon.com/Mismeasure-Man-Revised-Expanded/dp/0393314251 A Reanalysis of the Bell Curve: https://papers.ssrn.com/sol3/papers.cfm?abstract_id=225294 Cohen's Statistical Power Analysis for the Behavioral Sciences: http://utstat.toronto.edu/~brunner/oldclass/378f16/readings/CohenPower.pdf The Wilson Effect: https://www.researchgate.net/publication/255692897_The_Wilson_Effect_The_Increase_in_Heritability_of_IQ_With_Age
Views: 225 Modern Heresy
// Running a one-way repeated-measures analysis of variance (ANOVA) // An ANOVA compares means between groups. It can also do so for the same group across time points, which is a repeated-measures ANOVA. It tests whether the dependent variable differs significantly between the time points (null hypothesis: no difference). Unlike the t-test, an ANOVA can do this for more than two time points without the probability of committing a Type I error rising above the alpha level (e.g., 5%). With n tests, this probability rises to 1 − (0.95)^n. If pairwise comparisons were run as t-tests across ten groups, the probability of committing a Type I error would already be about 40%. An ANOVA avoids this problem and is therefore preferred for comparing the means of more than two groups. As usual for parametric tests, the prerequisites for an ANOVA are: normally distributed residuals (https://www.youtube.com/watch?v=Ibl33...), homoskedasticity (https://www.youtube.com/watch?v=BCNf2...), and independent/uncorrelated residuals (https://www.youtube.com/watch?v=OB9PF...). If the prerequisites are not met, in particular that of a metrically scaled dependent variable, the Mann-Whitney U test or the Kruskal-Wallis test can be used instead. When only one independent variable (e.g., taking a medication) is examined, a one-way analysis of variance is performed. Literature on effect size: Cohen, J. (1988), Statistical Power Analysis for the Behavioral Sciences, pp. 284-287: https://amzn.to/2JWGxMU Example: ********* The dependent variable is headache. The independent variable is the administration of a pain medication. Experimental group I receives a low dose of the medication, experimental group II a high dose, and the control group receives no medication.
For questions and comments on running a one-way repeated-measures analysis of variance (ANOVA), please use the comment section. Let me know whether you found the video helpful with a thumbs up or down. #statistikampc QUESTIONS ABOUT SPSS? All the answers are in the best SPSS book money can buy: Andy Field, Discovering Statistics Using IBM SPSS: http://amzn.to/2FgFFnq To support the channel, do your Amazon shopping via my affiliate link: http://amzn.to/2iBFeG9
Views: 10232 Statistik am PC
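The headache example at the end of the description shows what an ANOVA actually computes: the ratio of between-group to within-group variance. The sketch below builds the F statistic by hand in Python from made-up scores for the three groups (the numbers are invented for illustration and are not from the video); note it is a plain between-groups one-way ANOVA, without the repeated-measures structure.

```python
# Hypothetical headache ratings (higher = worse) for the three groups.
groups = {
    "low_dose":  [5, 4, 6, 5, 4],
    "high_dose": [2, 3, 2, 1, 3],
    "control":   [7, 8, 6, 7, 8],
}

all_values = [x for g in groups.values() for x in g]
grand_mean = sum(all_values) / len(all_values)

# Between-group sum of squares: how far group means sit from the grand mean.
ss_between = sum(len(g) * (sum(g) / len(g) - grand_mean) ** 2
                 for g in groups.values())
# Within-group sum of squares: spread of scores around their own group mean.
ss_within = sum((x - sum(g) / len(g)) ** 2
                for g in groups.values() for x in g)

df_between = len(groups) - 1               # k - 1
df_within = len(all_values) - len(groups)  # N - k
F = (ss_between / df_between) / (ss_within / df_within)
```

A large F (here the group means are far apart relative to the noise within groups) is what leads to rejecting the null hypothesis of equal means.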
Medical Student Gregory Kirschen takes us through the history of postpartum depression from Hippocrates to the present day. References American Psychiatric Association. (2013). Cautionary statement for forensic use of DSM-5. In Diagnostic and statistical manual of mental disorders (5th ed.). doi:10.1176/appi.books.9780890425596.744053 King, H. 1998. Hippocrates’ Woman: Reading the Female Body in Ancient Greece. London: Routledge, 1st ed. https://doi.org/10.4324/9780203025994 Adams, H. (1984). Comprehensive Handbook of Psychopathology. New York: Springer Science + Business Media. P. 319. ISBN 97814615668. Louden, I. 1988. Puerperal insanity in the 19th Century. Journal of the Royal Society of Medicine. 81, pp. 76-79. Reid, J. 1848. Dr. Reid on Puerperal Insanity. Journal of Psychological and Medical Mental Pathology. 1(1): 128-151. Meltzer HY, Stahl SM. 1976. The dopamine hypothesis of schizophrenia: a review. Schizophrenia Bulletin. 2(1): 19-76. Hirschfeld, R. 2000. History and evolution of the monoamine hypothesis of depression. The Journal of Clinical Psychiatry. 61(Suppl 6), 4-6. Winn, JM. 1855. On the Treatment of Puerperal Mania. Journal of Psychological and Medical Mental Pathology. 8(30): 309-313. MacLeod, MD. 1886. An address on puerperal insanity. British Medical Journal. 2(1336): 239-242. Theriot, N. 1989. Diagnosing Unnatural Motherhood: Nineteenth-century Physicians and ‘Puerperal Insanity.’ American Studies. 30(2): 69-88. Henry, WO. 1907. To what extent can the gynecologist prevent and cure insanity in women? JAMA. XLVIII(12): 997-1002. Boyd, R. 1870. Observations on puerperal insanity. Journal of Mental Science. 16(74): 153-165. Clark, AC. 1887. Aetiology, Pathology, and Treatment of Puerperal Insanity. Journal of Mental Science. 33(142): 169-189. Dunn, PM. 2002. Sir James Young Simpson (1811-1870) and obstetric anesthesia. Archives of Disease in Childhood-Fetal and Neonatal Edition. 86(3): F207-F209. Tuke, JB. 1867.
Cases illustrative of insanity of pregnancy, puerperal mania, and insanity of lactation. Edinburgh Medical Journal. 12(12): 1083-1101. Donkin, AS. 1863. The Pathological Relation between Albuminuria and Puerperal Mania. Journal of Mental Science. 9(47): 401-405. Miller GE, Cohen S, Ritchey AK. 2002. Chronic psychological stress and the regulation of pro-inflammatory cytokines: a glucocorticoid-resistance model. Health Psychology. 21(6): 531-541. Cassidy-Bushrow AE, Peters RM, Johnson DA, et al. 2012. Association between depressive symptoms with inflammatory biomarkers among pregnant African-American women. Journal of Reproductive Immunology. 94(2): 202-209. O’Mahony SM, Myint AM, van der Hove D, et al. 2006. Gestational stress leads to depressive-like behavioural and immunological changes in the rat. Neuroimmunomodulation. 13: 82-88. Bamford, CB. 1934. An analytical review of a series of cases of insanity with pregnancy. Journal of Mental Science. 80(328): 58-63. Ballantyne, JW. 1892. A series of thirteen cases of alleged maternal impression. Edinburgh Medical Journal. 37(11): 1025-1034. Fisher GJ. 1870. Does Maternal Mental Influence Have any Constructive or Destructive Power in the Production of Malformations or Monstrosities at any Stage of Embryonic Development? American Journal of Insanity. XXVI(III): 241-295. Pohlman AG. 1911. Maternal impression. Proceedings of the Indiana Academy of Science. 65-70. Kundakovic M, Gudsnuk K, Herbstman JB, et al. 2015. DNA methylation of BDNF as a biomarker of early-life adversity. PNAS. 112(22): 6807-6813. Boersma GJ, Lee RS, Cordner ZA, et al. 2014. Prenatal stress decreases Bdnf expression and increases methylation of Bdnf exon IV in rats. Epigenetics. 9(3): 437-447. Zajicek-Farber ML. 2009. Postnatal depression and infant health practices among high-risk women. Journal of Child and Family Studies. 18:236. Ban L, Gibson JE, West J, et al. 2010.
Association between perinatal depression in mothers and the risk of childhood infections in offspring: a population-based cohort study. BMC Public Health. 10:799. Ertel KA, Koenen KC, Rich-Edwards JW, et al. 2010. Antenatal and postpartum depressive symptoms are differentially associated with early childhood weight and adiposity. Paediatric and Perinatal Epidemiology. 24: 179-189. Beck CT. 1998. The effects of postpartum depression on child development: A meta-analysis. Archives of Psychiatric Nursing. 12(1): 12-20. Tuovinen S, Lahti-Pulkkinen M, Girchenko P, et al. 2018. Maternal depressive symptoms during and after pregnancy and child developmental milestones. Depression and Anxiety. 35(8): 732-741. Foundeur M, Fixsen C, Tribel WA, et al. 1957. Postpartum Mental Illness: A Controlled Study....
Views: 3 The Ob/Gyn Podcast
The power of a statistical test is the probability that it correctly rejects the null hypothesis when the null hypothesis is false. It can equivalently be thought of as the probability of correctly accepting the alternative hypothesis when the alternative hypothesis is true, that is, the ability of a test to detect an effect if the effect actually exists. This video is targeted to blind users. Attribution: Article text available under CC-BY-SA Creative Commons image source in video
Views: 59 Audiopedia
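The definition above (power = probability of detecting an effect that really exists) can be made concrete with a small sketch. Assuming a one-sample two-sided z-test with standardized effect size d, and ignoring the negligible opposite-tail term:

```python
import math
from statistics import NormalDist

def power_one_sample_z(d, n, alpha=0.05):
    """Approximate power of a two-sided one-sample z-test:
    P(reject H0 | true standardized effect d, sample size n)."""
    z = NormalDist()
    z_crit = z.inv_cdf(1 - alpha / 2)
    return z.cdf(d * math.sqrt(n) - z_crit)
```

With d = 0.5 and n = 32 this gives roughly 0.81, i.e., about a 19% chance of missing the effect (a Type II error); larger samples or larger effects push power toward 1.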
In statistics, an effect size is a quantitative measure of the strength of a phenomenon. For example, Kramer, Guillory, & Hancock (2014) showed that when emotionally positive posts were reduced in Facebook users' news feeds, users' status updates used fewer positive words (Cohen's d = 0.02) and more negative words (d = 0.001). Because these effect sizes (d = 0.02 and 0.001) are very close to zero, they indicate that this phenomenon is extremely weak; however, the reduction in positive words (d = 0.02) was 20 times stronger than the increase in negative words (d = 0.001). Effect sizes are calculated from the data of a study; a wide variety of effect size measures exist because of different kinds of data, differing study methodologies, and the need to quantify different aspects of research data. Effect sizes are practically important in their own right and play an important role in complementing statistical hypothesis testing, in statistical power analyses, and in meta-analyses where effect sizes across several studies are summarized. The concept of effect size already appears in everyday language. For example, a weight loss program may boast that it leads to an average weight loss of 30 pounds. In this case, 30 pounds is the value of the claimed effect size (with the effect size measure being the difference in means [i.e., mean weight before the program minus mean weight after the program]). Another example is that a tutoring program may claim that it raises school performance by one letter grade. This grade increase is the claimed effect size of the program. These are both examples of "absolute effect sizes", meaning that they convey the average difference between two groups without any discussion of the variability within the groups. For example, if the weight loss program results in an average loss of 30 pounds, it is possible that every participant loses exactly 30 pounds, or half the participants lose 60 pounds and half lose no weight at all.
This video is targeted to blind users. Attribution: Article text available under CC-BY-SA Creative Commons image source in video
Views: 742 Audiopedia
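The weight-loss example in the description reports only an absolute effect (30 pounds). A standardized effect size such as Cohen's d also accounts for the variability within groups. A minimal sketch, with invented before/after weights (for paired before/after designs the SD of the difference scores is often used instead; the pooled-SD form shown is the standard two-group version):

```python
import math
from statistics import mean, stdev

def cohens_d(sample1, sample2):
    """Cohen's d: mean difference divided by the pooled standard deviation."""
    n1, n2 = len(sample1), len(sample2)
    s1, s2 = stdev(sample1), stdev(sample2)
    pooled = math.sqrt(((n1 - 1) * s1 ** 2 + (n2 - 1) * s2 ** 2) / (n1 + n2 - 2))
    return (mean(sample1) - mean(sample2)) / pooled

# Hypothetical weights (lb) before and after a program -- not real data.
before = [180, 175, 185, 178, 172]
after = [180, 175, 185, 178, 172]
before = [200, 190, 210, 205, 195]
after = [180, 175, 185, 178, 172]
d = cohens_d(before, after)
```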
If You're So Free, Why Do You Follow Others? The Sociological Science Behind Social Networks and Social Influence. Nicholas Christakis, Professor of Medical Sociology, Medicine, and Sociology at Harvard University If you think you're in complete control of your destiny or even your own actions, you're wrong. Every choice you make, every behavior you exhibit, and even every desire you have finds its roots in the social universe. Nicholas Christakis explains why individual actions are inextricably linked to sociological pressures; whether you're absorbing altruism performed by someone you'll never meet or deciding to jump off the Golden Gate Bridge, collective phenomena affect every aspect of your life. By the end of the lecture Christakis has revealed a startling new way to understand the world that ranks sociology as one of the most vitally important social sciences. The Floating University Originally released September 2011. Additional Lectures: Michio Kaku: The Universe in a Nutshell http://www.youtube.com/watch?v=0NbBjNiw4tk Joel Cohen: An Introduction to Demography (Malthus Miffed: Are People the Problem?) http://www.youtube.com/watch?v=2vr44C_G0-o Steven Pinker: Linguistics as a Window to Understanding the Brain http://www.youtube.com/watch?v=Q-B_ONJIEcE Leon Botstein: Art Now (Aesthetics Across Music, Painting, Architecture, Movies, and More.) http://www.youtube.com/watch?v=j6F-sHhmfrY Tamar Gendler: An Introduction to the Philosophy of Politics and Economics http://www.youtube.com/watch?v=mm8asJxdcds
Views: 259437 Big Think
A statistical hypothesis test is a method of statistical inference using data from a scientific study. In statistics, a result is called statistically significant if it is unlikely to have occurred by chance alone, according to a pre-determined threshold probability, the significance level. The phrase "test of significance" was coined by statistician Ronald Fisher. These tests are used in determining what outcomes of a study would lead to a rejection of the null hypothesis for a pre-specified level of significance; this can help to decide whether results contain enough information to cast doubt on conventional wisdom, given that conventional wisdom has been used to establish the null hypothesis. The critical region of a hypothesis test is the set of all outcomes which cause the null hypothesis to be rejected in favor of the alternative hypothesis. Statistical hypothesis testing is sometimes called confirmatory data analysis, in contrast to exploratory data analysis, which may not have pre-specified hypotheses. In the Neyman-Pearson framework (see below), the process of distinguishing between the null and alternative hypotheses is aided by identifying two conceptual types of errors (Type I and Type II) and by specifying parametric limits on, e.g., how much Type I error will be permitted. This video is targeted to blind users. Attribution: Article text available under CC-BY-SA Creative Commons image source in video
Views: 365 Audiopedia
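The "critical region" described above is easy to state concretely. For a two-sided z-test it is the set of outcomes {z : |z| > z_(1-alpha/2)}; a sketch:

```python
from statistics import NormalDist

def reject_null(z_stat, alpha=0.05):
    """True if z_stat falls in the two-sided critical region |z| > z_(1-alpha/2)."""
    z_crit = NormalDist().inv_cdf(1 - alpha / 2)  # about 1.96 for alpha = 0.05
    return abs(z_stat) > z_crit

print(reject_null(2.5))  # → True  (inside the critical region)
print(reject_null(1.5))  # → False (not enough evidence against H0)
```

Lowering alpha shrinks the critical region, making the null hypothesis harder to reject.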
This event brought together a panel of three leading economists—Thomas Piketty, Kevin Murphy, and Steven Durlauf—to discuss the sources of the rise in inequality in advanced industrialized countries over the past 40 years, the problems it poses, and effective responses. Nobel laureate James Heckman moderated the panel and guided the discussion. This event was cosponsored by the Becker Friedman Institute, the Harris School of Public Policy, the Human Capital and Economic Opportunity Global Working Group, and the Center for the Economics of Human Development. If you experience technical difficulties with this video or would like to make an accessibility-related request, please send a message to [email protected]
Views: 8666 Becker Friedman Institute at UChicago - BFI
In statistics, meta-analysis refers to statistical methods for contrasting and combining results from different studies, in the hope of identifying patterns among study results, sources of disagreement among those results, or other interesting relationships that may come to light in the context of multiple studies. Meta-analysis can be thought of as "conducting research about previous research." In its simplest form, meta-analysis is done by identifying a common statistical measure that is shared between studies, such as effect size or p-value, and calculating a weighted average of that common measure. This weighting is usually related to the sample sizes of the individual studies, although it can also include other factors, such as study quality. The motivation of a meta-analysis is to aggregate information in order to achieve a higher statistical power for the measure of interest, as opposed to a less precise measure derived from a single study. In performing a meta-analysis, an investigator must make many choices that can affect its output, including deciding how to search for studies, selecting studies based on a set of objective criteria, dealing with incomplete data, analyzing the data, and accounting for or choosing not to account for publication bias. This video is targeted to blind users. Attribution: Article text available under CC-BY-SA Creative Commons image source in video
Views: 1903 Audiopedia
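The "weighted average of a common measure" described above is, in the simplest fixed-effect case, an inverse-variance weighted mean. A minimal sketch (the effect sizes and variances below are made up for illustration):

```python
import math

def fixed_effect_meta(effects, variances):
    """Fixed-effect meta-analysis: inverse-variance weighted mean
    effect size and its standard error."""
    weights = [1.0 / v for v in variances]
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    se = math.sqrt(1.0 / sum(weights))
    return pooled, se

# Three hypothetical studies: more precise studies (smaller variance) count more.
pooled, se = fixed_effect_meta([0.2, 0.5, 0.3], [0.04, 0.01, 0.02])
```

The pooled estimate (0.4 here) sits closest to the most precise study, and its standard error is smaller than that of any single study, which is the gain in statistical power the description mentions.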
IPPCR 2015: A Research Question and Implications for Efficient Clinical Trials Air date: Monday, October 19, 2015, 5:00:00 PM Category: IPPCR Runtime: 01:34:45 Description: The Introduction to the Principles and Practice of Clinical Research (IPPCR) is a course to train participants on how to effectively conduct clinical research. The course focuses on the spectrum of clinical research and the research process by highlighting epidemiologic methods, study design, protocol preparation, patient monitoring, quality assurance, and Food and Drug Administration (FDA) issues. For more information go to https://ippcr.nihtraining.com/login.php Author: John H. Powers, III, M.D., NIAID, NIH Permanent link: http://videocast.nih.gov/launch.asp?19250
Views: 20011 nihvcast
MIT RES.9-003 Brains, Minds and Machines Summer Course, Summer 2015 View the complete course: https://ocw.mit.edu/RES-9-003SU15 Instructor: Surya Ganguli Describes how the application of methods from statistical physics to the analysis of high-dimensional data can provide theoretical insights into how deep neural networks can learn to perform functions such as object categorization. License: Creative Commons BY-NC-SA More information at https://ocw.mit.edu/terms More courses at https://ocw.mit.edu
Views: 1107 MIT OpenCourseWare
What really causes addiction — to everything from cocaine to smart-phones? And how can we overcome it? Johann Hari has seen our current methods fail firsthand, as he has watched loved ones struggle to manage their addictions. He started to wonder why we treat addicts the way we do — and if there might be a better way. As he shares in this deeply personal talk, his questions took him around the world, and unearthed some surprising and hopeful ways of thinking about an age-old problem. TEDTalks is a daily video podcast of the best talks and performances from the TED Conference, where the world's leading thinkers and doers give the talk of their lives in 18 minutes (or less). Look for talks on Technology, Entertainment and Design -- plus science, business, global issues, the arts and much more. Find closed captions and translated subtitles in many languages at http://www.ted.com/translate Follow TED news on Twitter: http://www.twitter.com/tednews Like TED on Facebook: https://www.facebook.com/TED Subscribe to our channel: http://www.youtube.com/user/TEDtalksDirector
Views: 5890286 TED
On Wednesday Sept. 12, the Harvard Law School Library hosted a book talk and discussion in celebration of the recent publication of "Big Data, Health Law, and Bioethics," edited by I. Glenn Cohen, Holly Fernandez Lynch, Urs Gasser, and Effy Vayena. The talk was co-sponsored by the Petrie-Flom Center for Health Law Policy, Biotechnology and Bioethics and by the Berkman Klein Center for Internet & Society at Harvard University.
Views: 1019 Harvard Law School
As part of the 2018–2019 Fellows’ Presentation Series at the Radcliffe Institute for Advanced Study, Nicole C. Nelson RI ’19 encourages both scholarly and public discussions about notions of scientific fact in the wake of a recent phenomenon wherein scientists have found many supposedly stable findings to be difficult to replicate upon subsequent investigation. Nelson is an assistant professor of science and technology studies in the Department of Medical History and Bioethics at the University of Wisconsin–Madison. She is the 2018–2019 Katherine Hampson Bessell Fellow at the Radcliffe Institute, Harvard University. https://www.radcliffe.harvard.edu/event/2019-nicole-c-nelson-fellow-presentation For information about the Radcliffe Institute and its many public programs, visit https://www.radcliffe.harvard.edu/. Facebook: http://www.facebook.com/RadcliffeInstitute Twitter: http://www.twitter.com/RadInstitute Instagram: http://www.instagram.com/radcliffe.institute
Views: 1335 Harvard University
“How Numbers Lie: Intersectional Violence and the Quantification of Race” Tracing the genealogy of statistical discourses on race, Khalil Gibran Muhammad explores the violence of racial quantification on black women and men’s lives beginning in the postbellum period. Currently the director of the Schomburg Center for Research in Black Culture of the New York Public Library and a visiting professor at the City University of New York, Muhammad will begin his academic appointments as a professor of history, race, and public policy at Harvard Kennedy School and a Suzanne Young Murray Professor at Radcliffe on July 1, 2016. Presented by the Schlesinger Library at the Radcliffe Institute for Advanced Study. Video: Welcome by Lizabeth Cohen, Dean of the Radcliffe Institute Introduction by Jane Kamensky, Pforzheimer Foundation Director of the Schlesinger Library - 6:55 Lecture by Khalil Gibran Muhammad - 15:50
Views: 5587 Harvard University
Aerothermodynamics View the complete course: http://ocw.mit.edu/16-885F05 License: Creative Commons BY-NC-SA More information at http://ocw.mit.edu/terms More courses at http://ocw.mit.edu
Views: 19637 MIT OpenCourseWare
Watch the APA livestream "A Culture Shift Toward Openness and Reproducibility in Psychology" filmed on Wednesday, Oct. 25 at 7:00 pm ET with Brian Nosek, PhD, co-founder and executive director of the Center for Open Science, which operates the Open Science Framework that enables open and reproducible research practices worldwide. Nosek is a professor in the department of psychology at the University of Virginia. He co-founded Project Implicit, a multiuniversity collaboration for research and education investigating implicit cognition – thoughts and feelings that occur outside of awareness or control. Nosek investigates the gap between values and practices, such as when behavior is influenced by factors other than one's intentions and goals. Research applications of this interest include implicit bias, decision-making, attitudes, ideology, morality, innovation, barriers to change, open science and reproducibility. In 2015, he was named one of Nature’s 10 and to the Chronicle of Higher Education Influence list. He received his PhD from Yale University in 2002. This is the fourth in a series of talks being presented by the American Psychological Association to mark its 125th anniversary in 2017. __ The American Psychological Association is the leading scientific and professional organization representing psychology in the United States, with more than 115,700 researchers, educators, clinicians, consultants and students as its members. To learn more about the APA visit http://www.apa.org Follow APA on social media: Facebook https://www.facebook.com/AmericanPsychologicalAssociation/ Twitter https://twitter.com/apa LinkedIn https://www.linkedin.com/company/10738/ Google+ https://plus.google.com/+americanpsychologicalassociation
Views: 720 American Psychological Association
The Prior Art Archive is a new, open access system that allows anyone to upload their prior work and make it easily searchable by patent examiners. (see chapters below) Currently, patent examiners face barriers to finding and reviewing large swaths of existing prior art. This MIT-hosted and publicly accessible archive was created in collaboration with the USPTO and Cisco to address some of those barriers. We aim to make it easy for anyone to get their prior art in front of examiners and, as a result, significantly improve the quality of issued patents. MIT Media Lab Director Joi Ito and USPTO Director Andrei Iancu are leading a half-day launch celebration at the American Academy of Arts and Sciences on October 3. Chapter 1: 00:45 Welcome - Kate Darling, MIT Chapter 2: 04:48 How this project started - Dan Lang, Cisco Chapter 3: 15:55 Why MIT/Media Lab? - Joi Ito, MIT Media Lab Chapter 4: 26:26 The Mission of USPTO & efforts to improve patent quality - Andrei Iancu, USPTO Chapter 5: 43:39 The Prior Art Archive - Travis Rich, MIT Media Lab & Bhaskar Ranade, Cisco Chapter 6: 1:05:39 Colleen V. Chien, Santa Clara University Chapter 7: 1:18:22 - Mike J. Meurer, Boston University Chapter 8: 1:53:25 - Panel: How can we keep improving patent quality and why does it matter? More information at: https://www.media.mit.edu/events/prior-art-archive-launch/ License: CC-BY-4.0 (https://creativecommons.org/licenses/by-nc/4.0/)
Views: 2571 MIT Media Lab
CONTAGION Exploring Modern Epidemics A Radcliffe Institute Science Symposium BIG DATA James M. Wilson (6:10), associate research professor, School of Community Health Sciences, and director, Nevada Medical Intelligence Center, University of Nevada, Reno; director, Ascel Bio National Infectious Disease Forecast Station C. Jessica E. Metcalf (28:53), assistant professor of ecology and evolutionary biology and public affairs, Woodrow Wilson School, Princeton University Ami S. Bhatt (47:01), assistant professor of medicine (hematology) and of genetics, Stanford University School of Medicine Moderator: Caroline Buckee, associate professor of epidemiology, Harvard T.H. Chan School of Public Health PANEL DISCUSSION (1:07:26) AUDIENCE Q&A (1:15:02) For information about the Radcliffe Institute and its many public programs, visit https://www.radcliffe.harvard.edu/. Facebook: http://www.facebook.com/RadcliffeInstitute Twitter: http://www.twitter.com/RadInstitute Instagram: http://www.instagram.com/radcliffe.institute
Views: 1113 Harvard University
Hunt for Alien Earths - 02:00 NOVA scienceNOW visits astronomers who may be on the brink of finding "another Earth" outside our solar system but within our Milky Way galaxy. A new planet-hunting machine, the Kepler telescope, is at their disposal. This and other ingenious technologies could finally answer the age-old question: Are we alone? Art Authentication - 14:42 Vincent van Gogh has inspired several talented artists to turn their hands to forgery. Can computers be used to identify which works are really his? To find out, NOVA scienceNOW, working in cooperation with the Van Gogh Museum in Amsterdam, commissioned an expert in art reconstruction to make a meticulous copy of a van Gogh painting. Then, we challenged three different computer teams—from Princeton, Penn State, and Maastricht universities—to see if they could spot the imitation in a group that included five genuine van Goghs. Profile: Maydianne Andrade - 26:55 Maydianne Andrade's career might seem like something out of a horror film (her favorite genre), but Andrade can't imagine how she would spend her days and nights if not studying the cannibalistic behavior of the Australian redback spider. She has discovered why it makes evolutionary sense for the males of this species, a type of black widow, to make the ultimate sacrifice, what she calls "adaptive suicide." Autism Genes - 36:41 Rudy Tanzi, a pioneer in discovering genes for Alzheimer's disease, is turning his attention to autism. Using gene chips that can scan up to a million genetic markers across the entire human genome, Tanzi and others are on the hunt for the genetic key to a heartbreaking disorder that seems to come out of nowhere and yet affects millions of children and their families. Cosmic Perspective & Planet Hunters - 49:31 Turkish subtitles will be added soon. No copyright intended, all the copyrights belong to PBS NOVA. http://www.pbs.org/wgbh/nova/sciencenow/ Support PBS! 
http://www.shoppbs.org/family/index.jsp?categoryId=11580318&ab=NOVAscienceNOW Or donate: http://www.pbs.org/about/support-our-mission/
Views: 99053 araniel
President Trump is delivering his second State of the Union address before a joint session of Congress Tuesday night (a president's first address to Congress is not considered to be a State of the Union speech). He will be laying out his vision of the country and goals for his administration before an already divided body after being forced to delay his speech amid a partial government shutdown stemming from disputes over border security. CBSN’s continuous live coverage begins at 5:00 PM, ET, with a special ‘State of the Union’ edition of Red & Blue anchored by Elaine Quijano. At 8:00 PM, ET, Quijano and a panel of CBS News reporters and contributors will preview the speech and discuss the Trump administration’s policies since last year’s address. At 9:00 PM, ET, watch the State of the Union Address and the Democratic response live. Stay with CBSN as our panel of guests and experts deliver a comprehensive analysis. For live updates: https://www.cbsnews.com/live-news/2019-state-of-the-union-live-stream-donald-trump-democratic-rebuttal-live-updates/ -- Subscribe to the CBS News Channel HERE: http://youtube.com/cbsnews Watch CBSN live HERE: http://cbsn.ws/1PlLpZ7 Follow CBS News on Instagram HERE: https://www.instagram.com/cbsnews/ Like CBS News on Facebook HERE: http://facebook.com/cbsnews Follow CBS News on Twitter HERE: http://twitter.com/cbsnews Get the latest news and best in original reporting from CBS News delivered to your inbox. Subscribe to newsletters HERE: http://cbsn.ws/1RqHw7T Get your news on the go! Download CBS News mobile apps HERE: http://cbsn.ws/1Xb1WC8 Get new episodes of shows you love across devices the next day, stream CBSN and local news live, and watch full seasons of CBS fan favorites like Star Trek Discovery anytime, anywhere with CBS All Access. Try it free! 
http://bit.ly/1OQA29B --- CBSN is the first digital streaming news network that will allow Internet-connected consumers to watch live, anchored news coverage on their connected TV and other devices. At launch, the network is available 24/7 and makes all of the resources of CBS News available directly on digital platforms with live, anchored coverage 15 hours each weekday. CBSN. Always On.
Views: 689392 CBS News
"Teaching for Learning: What I have learned from learning research" When we think of the task of teachers as facilitating learning rather than as delivering information content, it leads to a profound shift in how we teach and what we stress in the process. I will talk about the ideas and research base underlying the "Framework for K-12 Science Education" and the vision for "three-dimensional learning" as defined by that document. Over 30 states, including California, are now attempting to implement this vision with their new science standards. I will also discuss how physics education research at the college level argues for a continuation of similar principles and shifts in emphasis for teaching physics (and other sciences) at the college level.
Views: 641 UC Berkeley Events
In statistics, a mediation model is one that seeks to identify and explicate the mechanism or process that underlies an observed relationship between an independent variable and a dependent variable via the inclusion of a third explanatory variable, known as a mediator variable. Rather than hypothesizing a direct causal relationship between the independent variable and the dependent variable, a mediational model hypothesizes that the independent variable influences the mediator variable, which in turn influences the dependent variable. Thus, the mediator variable serves to clarify the nature of the relationship between the independent and dependent variables. In other words, mediating relationships occur when a third variable plays an important role in governing the relationship between the other two variables. Researchers are now focusing their studies on better understanding known findings. Mediation analyses are employed to understand a known relationship by exploring the underlying mechanism or process by which one variable (X) influences another variable (Y) through a mediator (M). For example, suppose a cause X affects a variable (Y) presumably through some intermediate process (M). In other words X leads to M leads to Y. Thus, if gender is thought to be the cause of some characteristic, one assumes that other social or biological mechanisms associated with gender can explain how gender-associated differences arise. Such an intervening variable is called a mediator. This video is targeted to blind users. Attribution: Article text available under CC-BY-SA Creative Commons image source in video
Views: 3938 Audiopedia
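The mediation structure described above (X influences M, which in turn influences Y) can be sketched numerically. The simulated data, path coefficients, and the small `slope` helper below are illustrative assumptions, not taken from the video:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1000
x = rng.normal(size=n)
m = 0.5 * x + rng.normal(size=n)             # path a: X -> M
y = 0.7 * m + 0.1 * x + rng.normal(size=n)   # path b (M -> Y) plus a small direct effect

def slope(pred, resp):
    """OLS slope(s) of resp regressed on pred (with intercept)."""
    X = np.column_stack([np.ones(len(resp)), pred])
    return np.linalg.lstsq(X, resp, rcond=None)[0][1:]

a = slope(x, m)[0]                               # X -> M
b, c_prime = slope(np.column_stack([m, x]), y)   # M -> Y controlling for X; direct path
indirect = a * b                                 # mediated (indirect) effect of X on Y
print(round(a, 2), round(b, 2), round(indirect, 2))
```

The product a*b is the classic estimate of the indirect effect; c_prime is what remains of X's influence on Y once the mediator is accounted for.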
Featuring Simine Vazire, William G. Jacoby, Kristin Kanthak, John W. Patty and Brad Jones.
Views: 203 UC Davis Social Sciences
Analysis of variance (ANOVA) is a collection of statistical models used to analyze the differences between group means and their associated procedures (such as "variation" among and between groups), developed by R.A. Fisher. In the ANOVA setting, the observed variance in a particular variable is partitioned into components attributable to different sources of variation. In its simplest form, ANOVA provides a statistical test of whether or not the means of several groups are equal, and therefore generalizes the t-test to more than two groups. As doing multiple two-sample t-tests would result in an increased chance of committing a statistical type I error, ANOVAs are useful in comparing (testing) three or more means (groups or variables) for statistical significance. This video is targeted to blind users. Attribution: Article text available under CC-BY-SA Creative Commons image source in video
Views: 322 Audiopedia
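The variance partitioning described above can be sketched by computing a one-way ANOVA F statistic by hand; the three groups of values are made-up illustrative data:

```python
import numpy as np

# Three groups; ANOVA tests whether all group means are equal.
groups = [np.array([5.1, 4.8, 5.5, 5.0]),
          np.array([5.9, 6.2, 5.8, 6.1]),
          np.array([4.2, 4.5, 4.0, 4.3])]

all_vals = np.concatenate(groups)
grand_mean = all_vals.mean()
k, n = len(groups), len(all_vals)

# Partition the total variation into between-group and within-group components.
ss_between = sum(len(g) * (g.mean() - grand_mean) ** 2 for g in groups)
ss_within = sum(((g - g.mean()) ** 2).sum() for g in groups)

# F = mean square between / mean square within
f_stat = (ss_between / (k - 1)) / (ss_within / (n - k))
print(round(f_stat, 1))
```

A large F (relative to the F distribution with k-1 and n-k degrees of freedom) is evidence against the null hypothesis that all group means are equal, without the inflated type I error risk of running multiple pairwise t-tests.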
When it comes to the news and the health risks apparently inherent in just living everyday life, how do we know what to worry about? Do mobile phones cause brain cancer? Does BPA cause cancer? Is the HPV vaccine safe? There is conflicting information on the safety or lack thereof of nearly everything. To help cut through false science reports, Geoffrey C. Kabat joins the program today. How can we distinguish real science from poorly reported and misinterpreted stories? What risks are worth worrying about and what should we let go? Subscribe to our YouTube channel: http://well.org/subscribe Subscribe to The Urban Monk Podcast on iTunes: http://theurbanmonk.com/ Subscribe to the Health Bridge Podcast on iTunes: http://well.org/healthbridgepodcast Connect with us: http://well.org/ Facebook - http://well.org/facebook Twitter - http://well.org/twitter Pinterest - http://well.org/pinterest YouTube – https://www.youtube.com/user/wellchannel
Views: 452 Well.org
In statistics, dependence is any statistical relationship between two random variables or two sets of data. Correlation refers to any of a broad class of statistical relationships involving dependence. Familiar examples of dependent phenomena include the correlation between the physical statures of parents and their offspring, and the correlation between the demand for a product and its price. Correlations are useful because they can indicate a predictive relationship that can be exploited in practice. For example, an electrical utility may produce less power on a mild day based on the correlation between electricity demand and weather. In this example there is a causal relationship, because extreme weather causes people to use more electricity for heating or cooling; however, statistical dependence is not sufficient to demonstrate the presence of such a causal relationship (i.e., correlation does not imply causation). This video is targeted to blind users. Attribution: Article text available under CC-BY-SA Creative Commons image source in video
Views: 491 Audiopedia
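The electricity-demand example above can be sketched as a correlation computation; the temperature and demand figures below are invented for illustration:

```python
import numpy as np

# Daily peak temperature (°C) and electricity demand (GW) -- illustrative values.
temp = np.array([18, 22, 27, 30, 33, 35, 38])
demand = np.array([2.1, 2.3, 2.8, 3.2, 3.6, 3.9, 4.4])

# Pearson correlation coefficient between the two series
r = np.corrcoef(temp, demand)[0, 1]
print(round(r, 3))
```

A value of r near 1 indicates a strong positive linear association, which a utility could exploit for demand forecasting; as the description notes, such dependence alone does not establish causation.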
A scale-free network is a network whose degree distribution follows a power law, at least asymptotically. That is, the fraction P(k) of nodes in the network having k connections to other nodes goes for large values of k as P(k) ~ k^(−γ), where γ is a parameter whose value is typically in the range 2 < γ < 3, although occasionally it may lie outside these bounds. This video is targeted to blind users. Attribution: Article text available under CC-BY-SA Creative Commons image source in video
Views: 540 Audiopedia
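The power-law degree distribution described above, P(k) ~ k^(−γ) with γ typically between 2 and 3, can be sketched numerically (γ = 2.5 is an assumed example value, not from the video):

```python
import numpy as np

gamma = 2.5                 # assumed exponent, within the typical 2 < gamma < 3 range
k = np.arange(1, 10001)     # node degrees 1..10000
p = k ** -gamma
p /= p.sum()                # normalize into a proper degree distribution

# Scale-free signature: doubling the degree multiplies the probability
# by a constant factor 2**-gamma, independent of k.
ratio = p[199] / p[99]      # P(k=200) / P(k=100)
print(round(ratio, 4))
```

On a log-log plot this distribution is a straight line with slope −γ, which is the usual visual diagnostic for scale-free structure.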
This talk offers some guidance on how to appropriately use p-values while avoiding some logical pitfalls. A PDF of the slides presented can be found here: https://goo.gl/DsRQg7 Part of the "Biostatistics in Action: Tips for Clinical Researchers" lecture series that is sponsored by the Irving Institute for Clinical and Translational Research - Biostatistics, Epidemiology and Research Design resource, which is supported in part by an NIH Clinical and Translational Science Award (CTSA) through its Center for Advancing Translational Sciences (Grant No. UL1TR001873). The speaker, Bruce Levin, PhD is a Professor in the Department of Biostatistics at the Mailman School of Public Health. Sponsored by: The Irving Institute for Clinical and Translational Research: http://irvinginstitute.columbia.edu/ In affiliation with: The Department of Biostatistics at the Mailman School of Public Health: https://www.mailman.columbia.edu/become-student/departments/biostatistics
Views: 100 BERD Education
In mathematics and computer science, graph theory is the study of graphs, which are mathematical structures used to model pairwise relations between objects. A "graph" in this context is made up of "vertices" or "nodes" and lines called edges that connect them. A graph may be undirected, meaning that there is no distinction between the two vertices associated with each edge, or its edges may be directed from one vertex to another; see graph (mathematics) for more detailed definitions and for other variations in the types of graph that are commonly considered. Graphs are one of the prime objects of study in discrete mathematics. Refer to the glossary of graph theory for basic definitions in graph theory. This video is targeted to blind users. Attribution: Article text available under CC-BY-SA Creative Commons image source in video
Views: 298 Audiopedia
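The vertices-and-edges structure described above can be sketched as an adjacency list, here with a breadth-first traversal as a simple use; the small example graph is invented for illustration:

```python
from collections import deque

# Undirected graph as an adjacency list: each vertex maps to its neighbors,
# and every edge appears in both endpoints' lists.
graph = {
    "A": ["B", "C"],
    "B": ["A", "D"],
    "C": ["A", "D"],
    "D": ["B", "C", "E"],
    "E": ["D"],
}

def bfs_order(g, start):
    """Return vertices in breadth-first traversal order from a start vertex."""
    seen, order, queue = {start}, [], deque([start])
    while queue:
        v = queue.popleft()
        order.append(v)
        for w in g[v]:
            if w not in seen:
                seen.add(w)
                queue.append(w)
    return order

print(bfs_order(graph, "A"))
```

For a directed graph, the only change would be to list each edge under its source vertex alone.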
The SEBAS data (available at http://goo.gl/9PNehD) allow researchers to explore the relationships among life challenges, the social environment and health and to examine antecedents, correlates and consequences of change in biological measures and health. This webinar recording (originally broadcast on June 20, 2016) will provide an overview of the unique sources of information available in SEBAS and will show examples of how these data can be used in combination with other biosocial surveys for comparative analyses. The presentation will review the background for the SEBAS, describe the cohort and how the survey builds on the parent study (TLSA), briefly explain the measures that were collected, highlight some key research findings based on SEBAS, discuss the main strengths and weaknesses of the data, and outline how users may obtain the data. ABOUT THE STUDY SEBAS is a nationally representative longitudinal survey of Taiwanese middle-aged and older adults. It adds the collection of biomarkers and performance assessments to the Taiwan Longitudinal Study of Aging (TLSA), a nationally representative study of adults aged 60 and over, including the institutionalized population. The TLSA began in 1989, with follow-ups approximately every 3 years; younger refresher cohorts were added in 1996 and 2003. The first wave of SEBAS, based on a sub-sample of respondents from the 1999 TLSA, was conducted in 2000. A total of 1023 respondents completed both a face-to-face home interview and, several weeks later, a hospital-based physical examination—including a 12-hour overnight urine specimen, fasting blood specimen, and measurements of blood pressure and anthropometry. A second wave of SEBAS was conducted in 2006 using a similar protocol to SEBAS 2000, but with the addition of performance assessments conducted by the interviewers at the end of the home interview. 
Both waves of SEBAS also included measures of health status (physical, emotional, cognitive), health behaviors, social relationships and exposure to stressors. ICPSR is the world's largest archive of digital social science data. We acquire, preserve, and distribute original social science research data. ICPSR is a partner in social science research. Here is a link to presentation slides: http://www.icpsr.umich.edu/files/videos/SEBAS062016.pdf
Views: 397 ICPSR
AHRQ’s Practice-Based Research Network Resource Center hosted a research methodology webinar on August 18, 2015. Drs. Detry, Balasubramanian and Cohen discuss the minimum standards recommended for the design, implementation and reporting of adaptive clinical trials as applied to patient-centered outcomes research (PCOR). A detailed overview of how to apply the learning evaluation approach to quality improvement assessments across multiple organizations is also discussed. Video also at: https://pbrn.ahrq.gov/events/adaptive-trial-design-and-learning-evaluation-methods-pcor-and-quality-improvement-assessment
Views: 135 AHRQ Primary Care
Computer networks can be represented by (marked) point processes communicating information between nodes. Developing methodologies for finding and understanding correlations that exist between the point processes, particularly methods that can deal with inherent non-stationarity in the data, is therefore key to characterizing normal networks and hence spotting anomalous and potentially malicious behavior. Spectral methods in the stationary setting, and more recently time-frequency methods (e.g. wavelets) in the non-stationary setting, have proven to be extremely powerful tools for analyzing underlying structure in stochastic processes; however, their use in point processes is still relatively under-developed. They particularly have great potential for revealing periodic signaling (beaconing) that is typical of malicious behavior. Furthermore, they could be implemented in an extremely fast and computationally efficient way. In this talk, I will present some recent developments in spectral and wavelet methodology for point processes and discuss how they could have use in a cyber security setting. See more at https://www.microsoft.com/en-us/research/video/spectral-and-wavelet-coherence-for-point-processes-a-tool-for-cyber/
Views: 1019 Microsoft Research
Richard Phillips Feynman (/ˈfaɪnmən/; May 11, 1918 -- February 15, 1988) was an American theoretical physicist known for his work in the path integral formulation of quantum mechanics, the theory of quantum electrodynamics, and the physics of the superfluidity of supercooled liquid helium, as well as in particle physics (he proposed the parton model). For his contributions to the development of quantum electrodynamics, Feynman, jointly with Julian Schwinger and Sin-Itiro Tomonaga, received the Nobel Prize in Physics in 1965. He developed a widely used pictorial representation scheme for the mathematical expressions governing the behavior of subatomic particles, which later became known as Feynman diagrams. During his lifetime, Feynman became one of the best-known scientists in the world. In a 1999 poll of 130 leading physicists worldwide by the British journal Physics World he was ranked as one of the ten greatest physicists of all time. He assisted in the development of the atomic bomb during World War II and became known to a wide public in the 1980s as a member of the Rogers Commission, the panel that investigated the Space Shuttle Challenger disaster. In addition to his work in theoretical physics, Feynman has been credited with pioneering the field of quantum computing, and introducing the concept of nanotechnology. He held the Richard Chace Tolman professorship in theoretical physics at the California Institute of Technology. This video is targeted to blind users. Attribution: Article text available under CC-BY-SA Creative Commons image source in video
Views: 2613 Audiopedia
7th Annual Program in Ethics and Health Conference Session Chair: I. Glenn Cohen, J.D. Assistant Professor of Law and Co-Director, Petrie-Flom Center for Health Law Policy, Biotechnology, and Bioethics, Harvard Law School Matthew D. Adler, J.D. Leon Meltzer Professor of Law, University of Pennsylvania Law School Lisa Heinzerling, J.D. Professor of Law, Georgetown University Law Center Wendy Parmet, J.D. Associate Dean for Academic Affairs and George J. and Kathleen Waters Matthews Distinguished University Professor of Law, Northeastern University School of Law I. Glenn Cohen, J.D. Assistant Professor of Law and Co-Director, Petrie-Flom Center for Health Law Policy, Biotechnology, and Bioethics, Harvard Law School This conference focuses on how decision makers and the public tend to feel more strongly obligated to assist "identified" people at risk than to assist "statistical" ones, and the implications for public policy. To illustrate, when a group of Chilean miners were stranded following a 2010 mine accident, the rescue mission garnered worldwide support and millions of dollars, but the public had not felt a similar need to invest in mine safety measures that would have saved more statistical lives. What factors trigger or explain this difference in attitude and behavior? How is it manifested when we think about global health problems, such as treatment and prevention (and "treatment as prevention") for HIV/AIDS? Does the law express such bias? Is there any ethical justification for this bias, for example, as a matter of obligation toward each and every individual? Is it, alternatively, a moral error, rooted in well-known cognitive biases?
Views: 306 Harvard University
Celebrate Grinnell College's 2015 graduates. This is the full ceremony. Watch for your favorite graduates: 0:53:58 - Granting of degrees 0:54:45 - Humanities graduates 1:12:20 - Science graduates 1:34:58 - Social Sciences graduates 1:51:29 - Independent and interdivisional majors For highlights from the weekend, see: http://www.grinnell.edu/news/congratulations-class-2015 and https://grinnellcollege.exposure.co/commencement For videos of the honorary degree speakers and President Kington's charge to the graduate, see the Commencement 2015 playlist: https://www.youtube.com/playlist?list=PLkmKqcJuZirOw8dyEcOggY0N0Q_PGz0CV
Views: 2243 Grinnell College
March 10-11, 2015 - From Genome Function to Biomedical Insight: ENCODE and Beyond More: http://www.genome.gov/27560819
Views: 735 National Human Genome Research Institute
Plenary is a meeting of the full Assembly, held in the Senedd's debating Chamber. The Llywydd chairs Plenary, and it is the main forum for Assembly Members to carry out their role as democratically elected representatives. Plenary is held twice a week, on Tuesdays and Wednesdays, and is open to the public; it can also be watched live or on demand here on YouTube or on the Senedd TV website.
Views: 106 AssemblyCynulliad
Fred Oswald, Senior Associate Editor of the Journal of Managment, discusses best editorial practices. This presentation is recorded as part of the University of Florida Warrington College of Business' Reliable Research in Business initiative. To watch more videos about reliable research practices, please sign up here: https://warrington.ufl.edu/reliable-research-in-business/best-practices-for-reliable-research/.
Views: 23 UFWarrington
This is an audio version of the Wikipedia Article: Wikipedia 00:01:52 1 History 00:02:00 1.1 Nupedia 00:03:10 1.2 Launch and early growth 00:07:04 1.3 Milestones 00:09:55 2 Openness 00:10:34 2.1 Restrictions 00:11:57 2.2 Review of changes 00:13:00 2.3 Vandalism 00:14:58 3 Policies and laws 00:15:52 3.1 Content policies and guidelines 00:17:13 4 Governance 00:17:51 4.1 Administrators 00:18:51 4.2 Dispute resolution 00:19:31 4.2.1 Arbitration Committee 00:20:55 5 Community 00:23:58 5.1 Studies 00:25:28 5.2 Diversity 00:26:26 6 Language editions 00:30:09 6.1 English Wikipedia editor decline 00:32:16 7 Reception 00:33:41 7.1 Accuracy of content 00:38:10 7.2 Discouragement in education 00:39:27 7.2.1 Medical information 00:41:00 7.3 Quality of writing 00:44:05 7.4 Coverage of topics and systemic bias 00:46:07 7.4.1 Coverage of topics and selection bias 00:46:55 7.4.2 Systemic bias 00:49:31 7.5 Explicit content 00:52:28 7.6 Privacy 00:53:43 7.7 Sexism 00:54:21 8 Operation 00:54:30 8.1 Wikimedia Foundation and Wikimedia movement affiliates 00:57:24 8.2 Software operations and support 00:59:25 8.3 Automated editing 01:00:39 8.4 Hardware operations and support 01:02:00 8.5 Internal research and operational development 01:03:36 8.6 Internal news publications 01:04:40 9 Access to content 01:04:49 9.1 Content licensing 01:07:20 9.2 Methods of access 01:10:43 9.2.1 Mobile access 01:14:56 10 Cultural impact 01:15:05 10.1 Trusted source to combat fake news 01:15:43 10.2 Readership 01:17:21 10.3 Cultural significance 01:22:05 10.3.1 Awards 01:23:46 10.3.2 Satire 01:27:15 10.4 Sister projects – Wikimedia 01:28:13 10.5 Publishing 01:29:46 10.6 Research use 01:30:57 11 Related projects 01:32:48 12 See also Listening is a more natural way of learning when compared to reading. Written language only began at around 3200 BC, but spoken language existed long before that. 
Learning by listening is a great way to: - increase imagination and understanding - improve your listening skills - improve your own spoken accent - learn while on the move - reduce eye strain Now learn the vast amount of general knowledge available on Wikipedia through audio (audio article). You could even learn subconsciously by playing the audio while you are sleeping! If you are planning to listen a lot, you could try using a bone conduction headphone, or a standard speaker instead of an earphone. You can find other Wikipedia audio articles too at: https://www.youtube.com/channel/UCuKfABj2eGyjH3ntPxp4YeQ You can upload your own Wikipedia articles through: https://github.com/nodef/wikipedia-tts "The only true wisdom is in knowing you know nothing." - Socrates SUMMARY ======= Wikipedia ( (listen), (listen) WIK-ih-PEE-dee-ə) is a multilingual, web-based, free encyclopedia based on a model of openly editable and viewable content, a wiki. It is the largest and most popular general reference work on the World Wide Web, and is one of the most popular websites by Alexa rank. It is owned and supported by the Wikimedia Foundation, a non-profit organization that operates on money it receives from donors. Wikipedia was launched on January 15, 2001, by Jimmy Wales and Larry Sanger. Sanger coined its name, as a portmanteau of wiki and "encyclopedia". Initially an English-language encyclopedia, versions in other languages were quickly developed. With 5,769,083 articles, the English Wikipedia is the largest of the more than 290 Wikipedia encyclopedias. Overall, Wikipedia comprises more than 40 million articles in 301 different languages and by February 2014 it had reached 18 billion page views and nearly 500 million unique visitors per month. In 2005, Nature published a peer review comparing 42 science articles from Encyclopædia Britannica and Wikipedia and found that Wikipedia's level of accuracy approached that of Britannica. 
Time magazine stated that the open-door policy of allowing anyone to edit had made Wikipedia the biggest and possibly the best encyclopedia in the world, and was a testament to the vision of Jimmy Wales. Wikipedia has been criticized for exhibiting systemic bias, for presenting a mixture of "truths, half truths, and some falsehoods", and for being subject to manipulation and spin in controversial topics. In 2017, Facebook announced that it would help readers detect fake news by suggesting links to Wikipedia articles. YouTube announced a similar plan in 2018.
Views: 47 wikipedia tts
This is an audio version of the Wikipedia Article: Richard P. Feynman SUMMARY ======= Richard Phillips Feynman (; May 11, 1918 – February 15, 1988) was an American theoretical physicist, known for his work in the path integral formulation of quantum mechanics, the theory of quantum electrodynamics, and the physics of the superfluidity of supercooled liquid helium, as well as in particle physics for which he proposed the parton model. For his contributions to the development of quantum electrodynamics, Feynman, jointly with Julian Schwinger and Shin'ichirō Tomonaga, received the Nobel Prize in Physics in 1965. Feynman developed a widely used pictorial representation scheme for the mathematical expressions describing the behavior of subatomic particles, which later became known as Feynman diagrams. During his lifetime, Feynman became one of the best-known scientists in the world. 
In a 1999 poll of 130 leading physicists worldwide by the British journal Physics World he was ranked as one of the ten greatest physicists of all time. He assisted in the development of the atomic bomb during World War II and became known to a wide public in the 1980s as a member of the Rogers Commission, the panel that investigated the Space Shuttle Challenger disaster. Along with his work in theoretical physics, Feynman has been credited with pioneering the field of quantum computing and introducing the concept of nanotechnology. He held the Richard C. Tolman professorship in theoretical physics at the California Institute of Technology. Feynman was a keen popularizer of physics through both books and lectures including a 1959 talk on top-down nanotechnology called There's Plenty of Room at the Bottom and the three-volume publication of his undergraduate lectures, The Feynman Lectures on Physics. Feynman also became known through his semi-autobiographical books Surely You're Joking, Mr. Feynman! and What Do You Care What Other People Think? and books written about him such as Tuva or Bust! by Ralph Leighton and the biography Genius: The Life and Science of Richard Feynman by James Gleick.
Views: 33 wikipedia tts
This is an audio version of the Wikipedia Article: Edward Teller SUMMARY ======= Edward Teller (Hungarian: Teller Ede; January 15, 1908 – September 9, 2003) was a Hungarian-American theoretical physicist who is known colloquially as "the father of the hydrogen bomb", although he did not care for the title. He made numerous contributions to nuclear and molecular physics, spectroscopy (in particular the Jahn–Teller and Renner–Teller effects), and surface physics. His extension of Enrico Fermi's theory of beta decay, in the form of Gamow–Teller transitions, provided an important stepping stone in its application, while the Jahn–Teller effect and the Brunauer–Emmett–Teller (BET) theory have retained their original formulation and are still mainstays in physics and chemistry. Teller also made contributions to Thomas–Fermi theory, the precursor of density functional theory, a standard modern tool in the quantum mechanical treatment of complex molecules. 
In 1953, along with Nicholas Metropolis, Arianna Rosenbluth, Marshall Rosenbluth, and Augusta Teller, Teller co-authored a paper that is a standard starting point for the applications of the Monte Carlo method to statistical mechanics. Throughout his life, Teller was known both for his scientific ability and for his difficult interpersonal relations and volatile personality. Teller was born in Hungary and emigrated to the United States in the 1930s. He was an early member of the Manhattan Project, charged with developing the first atomic bomb; during this time he made a serious push to develop the first fusion-based weapons as well, but these were deferred until after World War II. After his controversial testimony in the security clearance hearing of his former Los Alamos Laboratory superior, J. Robert Oppenheimer, Teller was ostracized by much of the scientific community. He continued to find support from the U.S. government and military research establishment, particularly for his advocacy for nuclear energy development, a strong nuclear arsenal, and a vigorous nuclear testing program. He was a co-founder of Lawrence Livermore National Laboratory (LLNL), and was both its director and associate director for many years. In his later years, Teller became especially known for his advocacy of controversial technological solutions to both military and civilian problems, including a plan to excavate an artificial harbor in Alaska using thermonuclear explosives in what was called Project Chariot. He was a vigorous advocate of Ronald Reagan's Strategic Defense Initiative.
Views: 22 wikipedia tts