Data Skeptic

The Data Skeptic Podcast features interviews and discussion of topics related to data science, statistics, machine learning, artificial intelligence and the like, all from the perspective of applying critical thinking and the scientific method to evaluate the veracity of claims and efficacy of approaches.

https://dataskeptic.com

An average episode of this podcast runs 31m. 530 episodes have been published so far. This podcast is released weekly.

Total length of all episodes: 11 days 3 hours 4 minutes









Multi-Agent Diverse Generative Adversarial Networks


Despite the success of GANs in imaging, one of their major drawbacks is the problem of 'mode collapse,' where the generator learns to produce samples with extremely low variety. To address this issue, today's guests Arnab Ghosh and Viveka Kulharia...
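No code accompanies the episode; purely as an illustrative aside, here is a short NumPy sketch of what mode collapse looks like on a toy target. A generator that only ever produces samples near a couple of the target's eight modes scores poorly on a simple mode-coverage check. The toy data, the "collapsed" sampler, and the coverage metric are all assumptions for illustration.

import numpy as np

rng = np.random.default_rng(0)
MODES = np.arange(8) * 10.0                      # toy target: 8 well-separated modes

def sample_target(n):
    return rng.choice(MODES, size=n) + rng.normal(0.0, 0.5, size=n)

def sample_collapsed(n):
    # A collapsed generator: extremely low variety, only the first two modes.
    return rng.choice(MODES[:2], size=n) + rng.normal(0.0, 0.5, size=n)

def mode_coverage(samples, radius=2.0):
    # Fraction of target modes that receive at least one nearby sample.
    return np.mean([np.any(np.abs(samples - m) < radius) for m in MODES])

print("target coverage:   ", mode_coverage(sample_target(1000)))     # ~1.0
print("collapsed coverage:", mode_coverage(sample_collapsed(1000)))  # ~0.25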










 May 12, 2017  29m
 
 

[MINI] Generative Adversarial Networks


GANs are an unsupervised learning method involving two neural networks iteratively competing. The discriminator is a typical learning system. It attempts to develop the ability to recognize members of a certain class, such as all photos which have...
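To make the two-network setup concrete, here is a minimal sketch (not from the episode; it assumes PyTorch and a toy 1-D dataset) in which a generator learns to turn noise into samples resembling the real data while a discriminator learns to tell real from generated.

import torch
import torch.nn as nn

torch.manual_seed(0)

# Generator: noise -> candidate sample; Discriminator: sample -> P(real).
G = nn.Sequential(nn.Linear(4, 16), nn.ReLU(), nn.Linear(16, 1))
D = nn.Sequential(nn.Linear(1, 16), nn.ReLU(), nn.Linear(16, 1), nn.Sigmoid())

opt_g = torch.optim.Adam(G.parameters(), lr=1e-3)
opt_d = torch.optim.Adam(D.parameters(), lr=1e-3)
bce = nn.BCELoss()

def real_batch(n):
    # "Real" data: samples from a normal distribution centered at 3.
    return torch.randn(n, 1) + 3.0

for step in range(2000):
    # Discriminator update: label real samples 1, generated samples 0.
    real = real_batch(64)
    fake = G(torch.randn(64, 4)).detach()
    d_loss = bce(D(real), torch.ones(64, 1)) + bce(D(fake), torch.zeros(64, 1))
    opt_d.zero_grad(); d_loss.backward(); opt_d.step()

    # Generator update: try to fool the discriminator into outputting 1.
    fake = G(torch.randn(64, 4))
    g_loss = bce(D(fake), torch.ones(64, 1))
    opt_g.zero_grad(); g_loss.backward(); opt_g.step()

print("generated mean:", G(torch.randn(1000, 4)).mean().item())  # drifts toward 3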










 May 5, 2017  9m
 
 

Opinion Polls for Presidential Elections


Recently, we've seen opinion polls come under some skepticism. But is that skepticism truly justified? The recent Brexit referendum and US 2016 Presidential Election are examples where some claim the polls "got it wrong". This...










 April 28, 2017  52m
 
 

OpenHouse


No reliable, complete database cataloging home sales data at a transaction level is available for the average person to access. For data scientists interested in studying this data, our hands are completely tied. Opportunities like testing sociological...










 April 21, 2017  26m
 
 

[MINI] GPU CPU


There's more than one type of computer processor. The central processing unit (CPU) is typically what people mean when they say "processor". GPUs were introduced to be highly optimized for doing floating point computations in parallel. These types of...
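As a small illustrative sketch (assuming PyTorch and, if present, a CUDA GPU; not from the episode), the same large matrix multiplication can be timed on either processor; the GPU typically wins because the floating point work parallelizes well.

import time
import torch

def timed_matmul(device, n=2000):
    a = torch.randn(n, n, device=device)
    b = torch.randn(n, n, device=device)
    if device == "cuda":
        torch.cuda.synchronize()   # finish setup before timing
    t0 = time.perf_counter()
    c = a @ b                      # millions of float multiplies/adds in parallel
    if device == "cuda":
        torch.cuda.synchronize()   # wait for the asynchronous GPU kernel to finish
    return time.perf_counter() - t0

print("CPU seconds:", timed_matmul("cpu"))
if torch.cuda.is_available():
    print("GPU seconds:", timed_matmul("cuda"))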










 April 14, 2017  11m
 
 

[MINI] Backpropagation


Backpropagation is a common algorithm for training a neural network.  It works by computing the gradient of each weight with respect to the overall error, and using stochastic gradient descent to iteratively fine tune the weights of the network....
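A minimal NumPy sketch of the idea (not from the episode; full-batch gradient descent is used here for brevity where the episode mentions stochastic gradient descent): run a forward pass, apply the chain rule to get each weight's gradient with respect to the error, and step the weights downhill.

import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
y = (X[:, 0] + X[:, 1] > 0).astype(float).reshape(-1, 1)   # toy labels

# One hidden layer, sigmoid activations throughout.
W1, b1 = rng.normal(size=(2, 4)), np.zeros(4)
W2, b2 = rng.normal(size=(4, 1)), np.zeros(1)
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

lr = 0.5
for epoch in range(500):
    # Forward pass.
    h = sigmoid(X @ W1 + b1)
    p = sigmoid(h @ W2 + b2)
    # Backward pass: chain rule from the squared error back to each weight.
    d_p = (p - y) * p * (1 - p)
    d_W2, d_b2 = h.T @ d_p, d_p.sum(axis=0)
    d_h = (d_p @ W2.T) * h * (1 - h)
    d_W1, d_b1 = X.T @ d_h, d_h.sum(axis=0)
    # Gradient descent step.
    W2 -= lr * d_W2 / len(X); b2 -= lr * d_b2 / len(X)
    W1 -= lr * d_W1 / len(X); b1 -= lr * d_b1 / len(X)

print("training accuracy:", ((p > 0.5) == y).mean())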










 April 7, 2017  15m
 
 

Data Science at Patreon


In this week's episode of Data Skeptic, host Kyle Polich talks with guest Maura Church, Patreon's data science manager. Patreon is a fast-growing crowdfunding platform that allows artists and creators of all kinds to build their own subscription...










 March 31, 2017  32m
 
 

[MINI] Feed Forward Neural Networks


In a feed forward neural network, neurons cannot form a cycle. In this episode, we explore how such a network would be able to represent three common logical operators: OR, AND, and XOR. The XOR operation is the...
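As an illustrative NumPy sketch (not from the episode): with a step activation, a single neuron can represent OR and AND, but XOR is not linearly separable and needs a hidden layer whose outputs are combined.

import numpy as np

step = lambda z: (z > 0).astype(int)       # step activation
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])

# Single neurons suffice for OR and AND (both linearly separable).
OR  = step(X @ np.array([1, 1]) - 0.5)     # fires if at least one input is 1
AND = step(X @ np.array([1, 1]) - 1.5)     # fires only if both inputs are 1

# XOR needs a hidden layer: compute OR and AND, then combine them.
h1 = step(X @ np.array([1, 1]) - 0.5)      # OR of the inputs
h2 = step(X @ np.array([1, 1]) - 1.5)      # AND of the inputs
XOR = step(h1 - h2 - 0.5)                  # OR and not AND

print("OR :", OR)    # [0 1 1 1]
print("AND:", AND)   # [0 0 0 1]
print("XOR:", XOR)   # [0 1 1 0]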










 March 24, 2017  15m
 
 

Reinventing Sponsored Search Auctions


In this Data Skeptic episode, Kyle is joined by guest Ruggiero to discuss his latest efforts to mitigate the problems presented in this new world of online advertising. Working with his collaborators, Ruggiero reconsiders the search ad allocation and pricing...










 March 17, 2017  41m
 
 

[MINI] The Perceptron


Today's episode gives an overview of the perceptron algorithm. This rather simple approach is characterized by a few particular features. It updates its weights after seeing every example, rather than as a batch. It uses a step function as an activation...
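A minimal NumPy sketch of that description (the toy data and learning rate are assumptions, not from the episode): a step activation, with weights updated after every individual example rather than per batch.

import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 2))
y = np.where(X[:, 0] + 2 * X[:, 1] > 0, 1, -1)   # linearly separable toy labels

w, b, lr = np.zeros(2), 0.0, 0.1
for epoch in range(10):
    for xi, yi in zip(X, y):                     # update after every example, not per batch
        pred = 1 if xi @ w + b > 0 else -1       # step function as the activation
        if pred != yi:                           # only misclassified points move the weights
            w += lr * yi * xi
            b += lr * yi

print("training accuracy:", np.mean(np.where(X @ w + b > 0, 1, -1) == y))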










 March 10, 2017  14m