[[Informatics2-2020/Lab11|previous]] [[Informatics2-2020|up]] [[Informatics2-2020/Lab13|next]]

=Exercises=

==Introduction==
Some exercises to get used to numpy.
# Make a vector of length 10 with all elements zero! Then set its 4th element to 1. ''(zeros)''
# Make a 3-by-3 matrix with elements ranging from 0 up to 8! ''(arange, reshape)''
# Make a random vector of length 30 containing random numbers between 0 and 1! Calculate its average and standard deviation! ''(rand, mean, std)''
## Make a random vector of the same length with elements between -3 and 2!
# Make a random unit vector in 5 dimensions! First make a random vector in 5 dimensions, then normalize it to unit length.
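A minimal sketch of the exercises above; the 0-based index 3 for the "4th element" and the affine rescaling onto [-3, 2) are assumed readings of the tasks:

```python
import numpy as np

# 1. Zero vector of length 10; set its 4th element to 1
v = np.zeros(10)
v[3] = 1  # 4th element is index 3 (0-based)

# 2. 3-by-3 matrix with elements 0..8
m = np.arange(9).reshape(3, 3)

# 3. Random vector of length 30 with elements in [0, 1)
r = np.random.rand(30)
print(r.mean(), r.std())

# 3.1 Rescale to [-3, 2): multiply by the width of the interval, shift by the lower bound
r2 = 5 * np.random.rand(30) - 3

# 4. Random 5-dimensional vector, normalized to unit length
u = np.random.rand(5)
u = u / np.linalg.norm(u)
print(np.linalg.norm(u))  # 1.0 up to rounding
```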

==Monte-Carlo==
Generate 500000 random points in the rectangle <math>[0,2]\times[0,4]</math>. Count how many of the points <math>(x,y)</math> have the property that <math>x^2>y</math>. Since the rectangle has area 8, the fraction of such points multiplied by 8 approximates the integral <math>\int_0^2 x^2\,dx</math>, as at the end of the lecture.
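The counting step needs no Python loop; a sketch, where the factor 8 is the area of the rectangle:

```python
import numpy as np

n = 500_000
x = 2 * np.random.rand(n)  # uniform on [0, 2]
y = 4 * np.random.rand(n)  # uniform on [0, 4]
hits = np.sum(x**2 > y)    # points below the curve y = x^2

# fraction of hits times the rectangle's area estimates the integral
estimate = 8 * hits / n
print(estimate)  # close to 8/3
```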
==Numeric integral==
Estimate the integral of <math>e^{-x^2}</math> on the interval <math>[-2,5]</math> with the [https://en.wikipedia.org/wiki/Riemann_sum#Left_Riemann_sum left Riemann sum]!
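A sketch of the left Riemann sum; the choice of 100000 subintervals is an assumption, any large number works:

```python
import numpy as np

a, b, n = -2, 5, 100_000
dx = (b - a) / n
x = np.linspace(a, b, n, endpoint=False)  # left endpoints of the subintervals

# left Riemann sum: f evaluated at the left endpoint of each subinterval, times dx
integral = np.sum(np.exp(-x**2)) * dx
print(integral)
```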
==Gradient descent==
Consider the vector-to-scalar function <math>f(x,y)=x^2+y^2</math>. Starting from <math>(x_0,y_0) = (-1, -1)</math> we will find the minimum of the function.
A gradient step is when you subtract <math>\epsilon\cdot\nabla f(x,y)</math> from the point <math>(x,y)</math>.
If you do this many times with a small <math>\epsilon</math>, the point converges to a point where you cannot decrease the function value any more, i.e. where the gradient is zero.
This way you can find the minimum of the function (here it is <math>(x,y) = (0, 0)</math>).
* Store each step along the way, and plot them with matplotlib!
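A sketch of the iteration; the step size epsilon = 0.1 and the 100 iterations are assumptions, not given in the text:

```python
import numpy as np
import matplotlib.pyplot as plt

def grad(p):
    # gradient of f(x, y) = x^2 + y^2 is (2x, 2y)
    return 2 * p

eps = 0.1
p = np.array([-1.0, -1.0])
steps = [p.copy()]
for _ in range(100):
    p = p - eps * grad(p)  # one gradient step
    steps.append(p.copy())

steps = np.array(steps)
print(steps[-1])  # close to (0, 0)

# plot the path of the iterates
plt.plot(steps[:, 0], steps[:, 1], "o-")
plt.show()
```

Each step multiplies the point by <math>1 - 2\epsilon</math>, so with <math>\epsilon = 0.1</math> the iterates shrink geometrically towards the origin.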

==Numeric derivative==
Plot the function <math>\sin(x)</math> and its derivative on the interval <math>[-\pi, \pi]</math>.
Calculate the derivative with the [https://en.wikipedia.org/wiki/Finite_difference finite difference] method!
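A sketch using the forward difference <math>(y_{i+1}-y_i)/(x_{i+1}-x_i)</math>, which numpy.diff computes directly; the 1000 sample points are an assumption:

```python
import numpy as np
import matplotlib.pyplot as plt

x = np.linspace(-np.pi, np.pi, 1000)
y = np.sin(x)

# forward finite difference: (y[i+1] - y[i]) / (x[i+1] - x[i]);
# this has one fewer element than x, so plot it against x[:-1]
dy = np.diff(y) / np.diff(x)

plt.plot(x, y, label="sin(x)")
plt.plot(x[:-1], dy, label="finite-difference derivative")
plt.legend()
plt.show()
```

The finite-difference curve should lie on top of <math>\cos(x)</math>, the exact derivative.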

[[Informatics2-2020/Lab11|previous]] [[Informatics2-2020|up]] [[Informatics2-2020/Lab13|next]]