Featured Discussions - AnalyticBridge2019-12-12T15:39:41Zhttps://www.analyticbridge.datasciencecentral.com/forum/topic/list?feed=yes&xn_auth=no&featured=1Can Python do the following?tag:www.analyticbridge.datasciencecentral.com,2018-11-13:2004291:Topic:3894022018-11-13T18:02:59.085ZVincent Granvillehttps://www.analyticbridge.datasciencecentral.com/profile/VincentGranville
<p>These were features that I liked in Perl. Wondering if there is a way to make it work with Python?</p>
<ul>
<li>Automated memory allocation / de-allocation (for variables, arrays, hash tables etc.)</li>
<li>Turning your program into an executable (that is, pre-compiled)</li>
<li>Automated variable initialization (variables, arrays don't even need to be declared, much less initialized)</li>
<li>Automated type casting (e.g. automatically treating a same variable as an integer or string depending on the context: integer when performing a multiplication, or string for concatenation)</li>
</ul>
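<p>For what it's worth, a minimal sketch of how the first, third, and (partially) fourth items look in Python. Memory management is automatic via garbage collection, and <code>defaultdict</code> gives Perl-style auto-initialized hashes; note, however, that Python does not auto-cast between strings and numbers the way Perl does — the cast must be explicit. (Tools such as PyInstaller or cx_Freeze can bundle a script into an executable, though they package an interpreter rather than truly pre-compiling.)</p>

```python
from collections import defaultdict

# Memory allocation / de-allocation is automatic:
# objects are garbage-collected once no longer referenced.
counts = defaultdict(int)   # hash table; keys spring into existence
counts["word"] += 1         # no declaration or initialization needed

# Python is dynamically typed, but unlike Perl it does NOT
# auto-cast between strings and numbers; the cast is explicit.
x = 7
print(x * 3)                # integer arithmetic -> 21
print(str(x) + "3")         # string concatenation -> "73"
```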
<p><a href="https://storage.ning.com/topology/rest/1.0/file/get/134946461?profile=original" target="_blank" rel="noopener"><img src="https://storage.ning.com/topology/rest/1.0/file/get/134946461?profile=original" class="align-center"/></a></p>
<p>You are going to say that this makes for terrible programming, but I use the code only for myself, and I'd rather focus on the algorithms than on the coding and debugging. I am also wondering whether there are options for automated debugging.</p>
<p>I am also wondering how to produce sounds in Python, and which random number generator it uses. Finally, is high-precision computing (say, 500 digits of accuracy) reliable in Python using the default bignum libraries?</p>
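<p>On the last two points: Python's built-in <code>random</code> module uses the Mersenne Twister, and the standard-library <code>decimal</code> module handles arbitrary-precision arithmetic (Python integers are already arbitrary-precision; there is no separate "BigNum" install). A quick sketch:</p>

```python
import random
from decimal import Decimal, getcontext

# The default generator is the Mersenne Twister (period 2**19937 - 1).
random.seed(42)
r = random.random()
print(r)

# Arbitrary-precision arithmetic via the standard decimal module.
getcontext().prec = 500      # 500 significant digits
root2 = Decimal(2).sqrt()    # sqrt(2) to 500 digits
print(root2)
```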
<p>Thanks.</p>
<p></p> Explaining SQL JOINS - Improving on the Classic Venn Diagramstag:www.analyticbridge.datasciencecentral.com,2018-10-16:2004291:Topic:3892312018-10-16T13:32:32.727ZTim Millerhttps://www.analyticbridge.datasciencecentral.com/profile/TimMiller
<p class="yklcuq-10 hpxQMr">Back when I was learning SQL, I often got hung up on the JOIN concept. The Venn diagrams were a lifesaver, but as I learned more and used SQL more and more, I found that they were not quite enough.</p>
<p class="yklcuq-10 hpxQMr"></p>
<p class="yklcuq-10 hpxQMr">I worked with some of my colleagues at the Data School to try to go a little bit further.</p>
<p class="yklcuq-10 hpxQMr"></p>
<p class="yklcuq-10 hpxQMr">We were trying to keep it VERY basic so some join types and antijoins are not included.</p>
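<p>As a complement to the diagrams, the main join types (including the anti join left out of the visual) can be sketched with two toy tables — here in Python via pandas, since <code>merge</code> maps directly onto SQL's <code>JOIN</code> clauses:</p>

```python
import pandas as pd

left = pd.DataFrame({"id": [1, 2, 3], "name": ["a", "b", "c"]})
right = pd.DataFrame({"id": [2, 3, 4], "score": [20, 30, 40]})

inner = left.merge(right, on="id", how="inner")       # ids 2, 3 only
left_join = left.merge(right, on="id", how="left")    # all of left; NaN where no match
outer = left.merge(right, on="id", how="outer")       # ids 1 through 4

# Anti join (rows of left with no match in right), via the indicator flag:
anti = (left.merge(right, on="id", how="left", indicator=True)
            .query("_merge == 'left_only'")
            .drop(columns="_merge"))                  # id 1 only
```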
<p class="yklcuq-10 hpxQMr"></p>
<p class="yklcuq-10 hpxQMr">I would love to know what you all think.</p>
<p class="yklcuq-10 hpxQMr"><a href="http://storage.ning.com/topology/rest/1.0/file/get/2059721635?profile=original" target="_self"><img src="http://storage.ning.com/topology/rest/1.0/file/get/2059721635?profile=original" width="470" class="align-center"/></a></p>
<p class="yklcuq-10 hpxQMr"></p>
<p class="yklcuq-10 hpxQMr">For the full write up:<span> </span><a target="_blank" class="yklcuq-27 dlFmxw" href="https://dataschool.com/sql-join-types-explained-visualizing-sql-joins-and-building-on-the-classic-venn-diagrams/" rel="noopener">https://dataschool.com/sql-join-types-explained-visualizing-sql-joins-and-building-on-the-classic-venn-diagrams/</a></p> Anomaly Detectiontag:www.analyticbridge.datasciencecentral.com,2018-07-06:2004291:Topic:3866992018-07-06T13:40:18.124ZPaterno III Cobradorhttps://www.analyticbridge.datasciencecentral.com/profile/PaternoIIICobrador
<p>How would you go about an Unsupervised Anomaly Detection problem?</p>
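<p>One common starting point — not necessarily what the poster has in mind, just a sketch on synthetic data — is an isolation forest, which flags points that are easy to separate from the rest:</p>

```python
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)
X = rng.normal(0, 1, size=(500, 2))   # mostly "normal" points
X[:5] += 8                            # plant 5 obvious outliers

# contamination = expected fraction of anomalies (here ~1%)
model = IsolationForest(contamination=0.01, random_state=0).fit(X)
labels = model.predict(X)             # +1 = inlier, -1 = anomaly
print((labels == -1).sum())
```

<p>Other unsupervised options follow the same fit/predict pattern: one-class SVM, local outlier factor, or simple density thresholds.</p>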
Examples of Stochastic Processes in Machine Learningtag:www.analyticbridge.datasciencecentral.com,2018-06-11:2004291:Topic:3853392018-06-11T13:03:55.271ZJacob Weisshttps://www.analyticbridge.datasciencecentral.com/profile/JacobWeiss
<p>Hi All,</p>
<p>I am currently taking graduate-level coursework in Stochastic Processes. My hope is to apply Stochastic Processes in Machine Learning. I have just started to think about use cases, and one particular use case that stands out is having the machine learn which probability distribution to pick from when given a data set, then generate "X" random processes. </p>
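<p>That use case can be sketched directly: fit a few candidate distributions to the data and keep the one with the smallest Kolmogorov-Smirnov statistic, then sample new paths from the winner. The candidate list and selection criterion below are illustrative choices, not a standard recipe:</p>

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
data = rng.gamma(shape=2.0, scale=3.0, size=2000)   # toy data set

# Fit candidates; keep the best by the K-S statistic (smaller = better fit).
candidates = {"norm": stats.norm, "gamma": stats.gamma, "expon": stats.expon}
best_name, best_params, best_ks = None, None, np.inf
for name, dist in candidates.items():
    params = dist.fit(data)
    ks = stats.kstest(data, name, args=params).statistic
    if ks < best_ks:
        best_name, best_params, best_ks = name, params, ks

print(best_name)   # typically "gamma" for this toy data

# Generate new random draws/paths from the selected distribution.
paths = candidates[best_name].rvs(*best_params, size=(10, 100), random_state=2)
```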
<p>As an amateur, I was wondering if anyone else has tried to use stochastic processes in machine learning. Which use cases did you apply them to? What best practices can you share?</p>
<p>Thanks,</p>
<p>Jacob</p> Question about Some Statistical Distributions (Updated)tag:www.analyticbridge.datasciencecentral.com,2018-02-12:2004291:Topic:3804142018-02-12T17:46:44.817ZVincent Granvillehttps://www.analyticbridge.datasciencecentral.com/profile/VincentGranville
<p>What are the potential distributions for a continuous variable <em>X</em> on [0, 1], if |2<em>X</em> - 1| is known to have a uniform distribution on [0, 1]? Will the distribution of INT(2<em>X</em>) always be uniform on {0, 1} ?</p>
<p>This question arises in a potential proof that the digits of the number Pi in base 2 (see exercise 7 <a href="https://www.datasciencecentral.com/profiles/blogs/are-the-digits-of-pi-truly-random" target="_blank" rel="noopener">in this article</a>), distributed as INT(2<em>X</em>) and obviously equal to 0 or 1, are uniformly distributed (50% 0's and 50% 1's). </p>
<p></p>
<p><a href="http://storage.ning.com/topology/rest/1.0/file/get/2059721720?profile=original" target="_self"><img src="http://storage.ning.com/topology/rest/1.0/file/get/2059721720?profile=original" width="271" class="align-center"/></a></p>
<p><strong>Update</strong></p>
<p>I spent more time on this problem, and it is not an easy one. There are actually infinitely many solutions, as many as there are real numbers on [0, 1]. The vast majority of these distributions are nowhere continuous -- they don't have a density. To understand this, do the following simulation:</p>
<ul>
<li>Simulate <em>n</em> random deviates <em>u</em>(<em>n</em>) uniformly distributed on [0, 1].</li>
<li>Generate <em>n</em> numbers <em>d</em>(<em>n</em>) distributed on {-1, +1}. They don't need to be uniformly distributed: they can all be -1 or +1 or any combination of both. For instance <em>d</em>(<em>n</em>) can be -1 if the <em>n</em>-th digit of Pi in base 2, is zero, and +1 if the <em>n</em>-th digit of Pi in base 2, is one. You can use any other number instead of Pi, for instance 7/13, and then the final result will be different.</li>
<li>For each <em>n</em>, compute <em>v</em>(<em>n</em>) = <em>d</em>(<em>n</em>) * <em>u</em>(<em>n</em>).</li>
<li>For each <em>n</em>, compute <em>x</em>(<em>n</em>) = (1 + <em>v</em>(<em>n</em>)) / 2.</li>
</ul>
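<p>The four steps above can be sketched as follows. Here the <em>d</em>(<em>n</em>)'s are driven by the binary digits of 7/13 replaced by 1/7 for illustration — a number whose binary expansion has 1's one third of the time, so the simulation makes the non-uniform case visible. (The digit-extraction helper is my own; Pi's bits would need a bignum expansion instead.)</p>

```python
import random

random.seed(0)
n = 100_000

def d(k, num=1, den=7):
    """d(k) in {-1, +1}, driven by the k-th binary digit of num/den.
    1/7 = 0.001001... in base 2, so about 1/3 of the d(k)'s are +1."""
    bit = (num * 2**k // den) % 2
    return 1 if bit == 1 else -1

xs = []
for k in range(1, n + 1):
    u = random.random()        # u(k) uniform on [0, 1]
    v = d(k) * u               # v(k) = d(k) * u(k)
    xs.append((1 + v) / 2)     # x(k) = (1 + v(k)) / 2

# The proportion of INT(2X) equal to 1 tracks the proportion
# of +1's among the d(k)'s -- here about 1/3, so not uniform.
ones = sum(1 for x in xs if int(2 * x) == 1)
print(ones / n)
```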
<p>The limiting random variable <em>X</em> attached to the <em>x</em>(<em>n</em>)'s, as <em>n</em> tends to infinity, is solution to the problem. However, there are as many solutions as there are ways to generate the <em>d</em>(<em>n</em>)'s, and the distribution of INT(2<em>X</em>) will be discrete on {0, 1}, but usually not uniform: it will depend on the proportions of +1 and -1 in the <em>d</em>(<em>n</em>)'s. If you use the number Pi to compute the <em>d</em>(<em>n</em>)'s, it will be uniform.</p> Generalized Coefficient of Correlation for Non-Linear Relationshipstag:www.analyticbridge.datasciencecentral.com,2018-02-12:2004291:Topic:3804122018-02-12T17:24:16.144ZVincent Granvillehttps://www.analyticbridge.datasciencecentral.com/profile/VincentGranville
<p>What is the best correlation coefficient R(<em>X</em>, <em>Y</em>) to measure non-linear dependencies between two variables <em>X</em> and <em>Y</em>? Let's say that you want to assess whether there is a linear or quadratic relationship between <em>X</em> and <em>Y</em>. One way to do it is to perform a polynomial regression such as <em>Y</em> = <em>a</em> + <em>bX</em> + <em>cX</em>^2, and then measure the standard coefficient of correlation between the predicted and observed values. How good is this approach? </p>
<p><a href="http://storage.ning.com/topology/rest/1.0/file/get/2059721641?profile=original" target="_self"><img src="http://storage.ning.com/topology/rest/1.0/file/get/2059721641?profile=original" width="430" class="align-center"/></a></p>
<p>Note that the proposed correlation coefficient R(<em>X</em>, <em>Y</em>) is not symmetric. One way to get a symmetric version, is to use the maximum between | R(<em>X</em>, <em>Y</em>) | and | R(<em>Y</em>, <em>X</em>) |. It will be equal to 1 if and only if there is an exact polynomial or inverse polynomial relationship between <em>X</em> and <em>Y</em>. </p>
<p><strong>Note</strong>: If one checks the model <em>Y</em> = <em>a</em> + <em>b</em>X + <em>c</em>X^2, the "inverse polynomial" model would be <em>X</em> = <em>a'</em> + <em>b'Y</em> + <em>c'Y</em>^2. So, R(<em>X</em>, <em>Y</em>) is computed on the first regression, while R(<em>Y</em>, <em>X</em>) is computed on the second (reversed, also called dual) regression. </p>
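<p>A minimal sketch of the proposed coefficient and its symmetrized version, on toy data with an exact-up-to-noise quadratic relationship (the function name <code>R</code> and the test data are mine, matching the notation above):</p>

```python
import numpy as np

def R(x, y, deg=2):
    """Correlation between observed y and the degree-`deg`
    polynomial regression of y on x."""
    coeffs = np.polyfit(x, y, deg)
    y_pred = np.polyval(coeffs, x)
    return np.corrcoef(y, y_pred)[0, 1]

rng = np.random.default_rng(0)
x = rng.uniform(-3, 3, 200)
y = 1 + 2 * x + 0.5 * x**2 + rng.normal(0, 0.3, 200)   # quadratic + noise

r_xy = R(x, y)                  # regression of Y on X
r_yx = R(y, x)                  # dual regression of X on Y
r_sym = max(abs(r_xy), abs(r_yx))
print(round(r_xy, 3))           # close to 1: the quadratic model fits well
```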
<p><strong>Discussion</strong></p>
<p>An issue with my approach is the risk of over-fitting. If you have <em>n</em> observations and <em>n</em> coefficients in the regression, my correlation will always be 1.</p>
<p>There are various ways to avoid this problem, for instance:</p>
<ul>
<li>Use a polynomial of degree 2 maximum, regardless of the number of observations.</li>
<li>Use much smoother functions than polynomials, for instance functions that have at most one extremum (maximum or minimum) and grow no faster than a linear function. Even in that case, use a small number of coefficients in the regression, maybe log(log(<em>n</em>)) where <em>n</em> is the number of observations.</li>
</ul>
<p>The correlation coefficient in question can also be used for model selection: The best model would provide the correlation closest to 1.</p> My paper on Differential Geometry on Graphs with applications to Foreign Exchange Option Symmetrytag:www.analyticbridge.datasciencecentral.com,2017-10-02:2004291:Topic:3721022017-10-02T20:55:19.864ZValery A. Kholodnyihttps://www.analyticbridge.datasciencecentral.com/profile/ValeryAKholodnyi
<p>Dear Colleagues,<br/> <br/> I would like to bring to your attention my paper of 2001 on differential geometry on graphs with applications to the foreign exchange option symmetry in a multiple currency foreign exchange market that might be of interest to you:</p>
<p> </p>
<p>[1] V.A. Kholodnyi and J.F. Price, <i>Foreign Exchange Option Symmetry and a Coordinate-Free Description of a Multiple Currency Market in Terms of Differential Geometry on Graphs,</i> Journal of Nonlinear Analysis, 47 (9) (2001) 5885-5896.</p>
<p> </p>
<p>I introduced the notion of the differential geometry on graphs in 1995 in the following preprint:</p>
<p> </p>
<p>[2] V.A. Kholodnyi, <i>Beliefs-Preferences Gauge Symmetry Group and Dynamic Replication of Contingent Claims in a General Market Environment</i>, IES Preprint, 1995.</p>
<p> </p>
<p>The foreign exchange symmetry was introduced in 1996 in the following preprints:</p>
<p> </p>
<p>[3] V.A. Kholodnyi and J.F. Price, <i>Foreign Exchange Option Symmetry in a General Market Environment</i>, IES Preprint, 1996.</p>
<p> </p>
<p>[4] V.A. Kholodnyi and J.F. Price, <i>Foreign Exchange Option Symmetry in a Multiple Currency General Market Environment</i>, IES Preprint, 1996.</p>
<p> </p>
<p>Please also find below the references to some of my related published books and papers that might be also of interest to you:</p>
<p> </p>
<p>[5] V.A. Kholodnyi, <i>Beliefs-Preferences Gauge Symmetry Group and Replication of Contingent Claims in a General Market Environment</i>, IES Press, Research Triangle Park, North Carolina, 1998.</p>
<p> </p>
<p>[6] V.A. Kholodnyi and J.F. Price, <i>Foreign Exchange Option Symmetry</i>, World Scientific, River Edge, New Jersey, 1998.</p>
<p> </p>
<p>[7] V.A. Kholodnyi and J.F. Price, <i>Foundations of Foreign Exchange Option Symmetry</i>, IES Press, Research Triangle Park, North Carolina, 1998.</p>
<p> </p>
<p>[8] V.A. Kholodnyi, <i>Valuation and Dynamic Replication of Contingent Claims in the Framework of the Beliefs-Preferences Gauge Symmetry,</i> European Physical Journal B, 27 (2) (2002) 229-238.</p>
<p> </p>
<p>[9] V.A. Kholodnyi and J.F. Price, <i>Foreign Exchange Option Symmetry Based on Domestic-Foreign Payoff Invariance</i>, Proceedings of the IEEE/IAFE Conference on Computational Intelligence for Financial Engineering (CIFEr), New York, (1997), 164-170.</p>
<p> </p>
<p>For further information about my related books and papers please find below the link to my Profile at the ResearchGate: <a href="http://www.researchgate.net/profile/Valery_Kholodnyi">http://www.researchgate.net/profile/Valery_Kholodnyi</a>.</p>
<p><br/> Please let me know if you might have questions or would like further information.<br/> <br/> Sincerely,<br/> Valery Kholodnyi</p> Two Great Courses on Deep Learning and AItag:www.analyticbridge.datasciencecentral.com,2017-08-10:2004291:Topic:3693842017-08-10T23:03:42.725ZVincent Granvillehttps://www.analyticbridge.datasciencecentral.com/profile/VincentGranville
<p><strong>Deep Learning, Neural Networks and AI</strong></p>
<p>The course is a new one by <a href="https://www.coursera.org/instructor/andrewng" target="_blank" rel="noopener noreferrer">Andrew Ng</a><span>, Co-founder, Coursera; Adjunct Professor, Stanford University; formerly head of Baidu AI Group/Google Brain. It will start Aug 15. </span></p>
<p><span>About this course: <span>If you want to break into cutting-edge AI, this course will help you do so. Deep learning engineers are highly sought after, and mastering deep learning will give you numerous new career opportunities. Deep learning is also a new "superpower" that will let you build AI systems that just weren't possible a few years ago.</span></span></p>
<p><span>In this course, you will learn the foundations of deep learning. When you finish this class, you will:</span></p>
<ul>
<li>Understand the major technology trends driving Deep Learning</li>
<li>Be able to build, train and apply fully connected deep neural networks</li>
<li>Know how to implement efficient (vectorized) neural networks</li>
<li>Understand the key parameters in a neural network's architecture</li>
</ul>
<p><strong>Deep Learning with TensorFlow</strong></p>
<p><span>To help make deep learning even more accessible to engineers and data scientists at large, Google has launched a free Deep Learning Course. This short, intensive course provides you with all the basic tools and vocabulary to get started with deep learning, and walks you through how to use it to address some of the most common machine learning problems. It is also accompanied by interactive TensorFlow notebooks that directly mirror and implement the concepts introduced in the lectures. </span></p>
<p><span>Links to these two courses <a href="http://www.datasciencecentral.com/profiles/blogs/two-great-courses-on-deep-learning-and-ai" target="_blank">are provided here</a>. </span></p>
<p>I could not resist posting it. Note that I do not share the views expressed in this video.</p>
<p><iframe width="560" height="315" src="https://www.youtube.com/embed/pkdz5kFuLlo?wmode=opaque" frameborder="0" allowfullscreen=""></iframe>
</p>
<p>Enjoy!</p>
<p></p>
<p></p> Converting infinite series to finite series, a problem cited here in 2016, is 4000 years old.tag:www.analyticbridge.datasciencecentral.com,2017-06-11:2004291:Topic:3660432017-06-11T22:02:43.211ZMilo Gardnerhttps://www.analyticbridge.datasciencecentral.com/profile/MiloGardner
IEEE has long noted that Egyptians were the first to convert base 10 to a form of binary arithmetic. The Old Kingdom seemed to round off base-10 numerals and rational numbers by throwing away units as small as 1/64. Nearby Babylonians rounded off much smaller units in their base-60 numeration system, which wrote 1/91 as 1/90.<br />
<br />
However, by 2050 BCE Egyptian scribes had formalized an exact numeration system, as the Kahun Papyrus and Ahmes (RMP) 2/n tables report:<br />
<br />
<a href="http://rmprectotable.blogspot.com/">http://rmprectotable.blogspot.com/</a><br />
<br />
These tables scaled n/p by an LCM m/m to mn/mp, finding the best divisors of mp that summed to mn. Ahmes often recorded the divisors in red auxiliary numbers, before five-term or shorter unit-fraction series were recorded in a ciphered numeration system.<br />
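<p>A hypothetical sketch of that 2/n method as I read it (function name and search bounds are my own): scale 2/p by m/m to 2m/(mp), then search the divisors of mp for a short subset summing to 2m — the "red auxiliary" numbers — each divisor d yielding the unit fraction 1/(mp/d).</p>

```python
from itertools import combinations

def two_over_p(p, max_m=30):
    """Expand 2/p as a short sum of unit fractions, RMP-style:
    scale by m/m to 2m/(mp), then find divisors of mp summing
    to 2m; each divisor d contributes the unit fraction 1/(mp/d)."""
    for m in range(1, max_m + 1):
        divisors = [d for d in range(1, m * p + 1) if (m * p) % d == 0]
        for r in range(1, 5):          # five-term or shorter series
            for combo in combinations(divisors, r):
                if sum(combo) == 2 * m:
                    return sorted(m * p // d for d in combo)
    return None

res = two_over_p(7)
print(res)   # [4, 28], i.e. 2/7 = 1/4 + 1/28
```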
<br />
Let me stop here. Does anyone wish to comment on this, or on deeper historical threads showing that scribes also used the modern number-theory property that division was the inverse of multiplication, and multiplication the inverse of division, literally?<br />
<br />
Best Regards to all.