# Iron Horse Middle School


Unstable Spiral (In Development)

Unstable Spiral is a legacy project dedicated to Howard Blumenfeld, author of *Mental Architecture* and a mathematics professor at Las Positas College. Unstable Spiral was a website he worked on in college, and for his birthday I'm recreating it with a modern look. I'm using Bootstrap for the styling and PHP for the back end. The website's homepage is almost done; the rest of the site is still in development. Follow the link below to see the work in progress.

@AdamBlumenfeld The link to the website (still in development): https://unstable-spiral--snakebiking49.repl.co/

Linear Regression Using Gradient Descent

This is *Linear Regression Using Gradient Descent*, optimized to run in about 50 milliseconds (an average over 12 runs).
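For reference, here is the update rule the code below implements: standard batch gradient descent with learning rate α over all m samples, using the post's convention that θ₀ is the slope and θ₁ the intercept:

$$h_\theta(x) = \theta_0 x + \theta_1,\qquad \theta_0 := \theta_0 - \alpha\,\frac{1}{m}\sum_{i=1}^{m}\bigl(h_\theta(x_i)-y_i\bigr)x_i,\qquad \theta_1 := \theta_1 - \alpha\,\frac{1}{m}\sum_{i=1}^{m}\bigl(h_\theta(x_i)-y_i\bigr)$$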

""" @nexclap/AdamBlumenfeld """ # Imports import numpy as np from random import randint from matplotlib import pyplot as plt # Define Style Of Matplotlib Graphs plt.style.use("ggplot") # Define data X = np.array([1, 3, 5, 6, 8, 10, 11, 18, 19, 20, 24, 26, 30, 32, 36, 38, 39, 40, 43, 46, 52, 55, 56, 58, 59]) y = np.array([3, 4, 5, 7, 8, 9, 10, 12, 14, 15, 21, 36, 37, 38, 39, 40, 43, 46, 49, 51, 54, 56, 58, 60, 69]) # Plot data plt.scatter(X, y) plt.show() #Regressor Class class Regressor: # Training Function def fit(self, X, y, learning_rate=0.00001, converge=0.001, cst=False): # Cst is weather or not to make a history of cost for further analysis self.cst_b = cst if cst: self.cst = [[], []] # Dataset self.X = X self.y = y # Learning rate, or "a" in the gradient decent formula self.learning_rate = learning_rate # The M and B values in the hypothysis function self.theta = [0, 0] # Cost, which initialtes at infinity self.cost = float('inf') # The iterator of the gradient decent algorithm, mine is recursive (Lol, I just had to add that flex) self.gradient_decent_step(converge) # isub for theta, basically saying theta -= (whatever), only for practical reasons, I had to make it a seprete function def theta_isub(self, i, other): self.theta[i] -= other return self.theta[i] # Calculate and update (or store if cst is True) cost def _cost(self, iteration=None): # Cost function self.cost = (1/(2*len(X))*sum([(self.h(X[index]) - y[index])*X[index] for index in range(len(X))])**2) if self.cst_b: # Update cst self.cst[0].append(self.cost) self.cst[1].append(iteration) # Hypothesis function def h(self, x): # h_ฮธ(x) = ฮธโ‚ + ฮธโ‚€x (Yes, I know that in my hypothysis function is switched around) return x*self.theta[0] + self.theta[1] # Gradient decent iterator def gradient_decent_step(self, converge, iteration=1): # Base case: if the cost is less than the set convergence point than accept current theata values if self.cost <= converge: return None # Do one iteration of gradient decent self._step() # Compute cost self._cost(iteration) return self.gradient_decent_step(converge, iteration+1) # All the math of gradient decent, (Now you know why I made the theta_isub function) def _step(self): return [self.theta_isub(0, self.learning_rate * (1/len(X)*sum([(self.h(X[index]) - y[index])*X[index] for index in range(len(X))]))),self.theta_isub(1, self.learning_rate * (1/len(X)*sum([self.h(X[index]) - y[index] for index in range(len(X))])))] # Define a model model = Regressor() # Train model (With cst = True for graphing) model.fit(X, y, cst=True) # Get the theta (M and B values) and the cst variable (or history of cost to iterations) theta = model.theta cst = model.cst # Nerd plot stuff (Plot linear regression graph) x = np.linspace(0,60,100) y1 = theta[0]*x+theta[1] plt.title("Linear Regression") plt.scatter(X, y, c='teal') plt.plot(x, y1) #plt.savefig("linear_regression.png") (Saves graph to file) plt.show() # More nerd plot stuf (Plot cost graph (cst)) plt.title("Cost") plt.plot(cst[1], cst[0]) #plt.savefig("cost.png") (Saves graph to file) plt.show()

Challenge: Program A Graphing Calculator: Prize Money!

๐‚๐ก๐š๐ฅ๐ฅ๐ž๐ง๐ ๐ž ๐ƒ๐ž๐ฌ๐œ๐ซ๐ข๐ฉ๐ญ๐ข๐จ๐ง: Create a program that simulates a graphing calculator with numerical, graphical, and simple algebraic functions. Submit your result's to Nexclap with the post tag ๐—ด๐—ฟ๐—ฎ๐—ฝ๐—ต๐—ถ๐—ป๐—ด๐—ฐ๐—ฎ๐—น๐—ฐ๐˜‚๐—น๐—ฎ๐˜๐—ผ๐—ฟ๐—ฐ๐—ต๐—ฎ๐—น๐—น๐—ฎ๐—ป๐—ด๐—ฒ. Be sure to include instructions on what platform and/or language the program is in. I will periodically update the state of the challenge and will post the final results on Nexclap on ๐Ÿ–/๐Ÿ–/๐Ÿ๐ŸŽ๐Ÿ๐Ÿ—. ๐‚๐ก๐š๐ฅ๐ฅ๐ž๐ง๐ ๐ž ๐ซ๐ฎ๐ฅ๐ž๐ฌ: ๐—ก๐—ผ ๐—•๐˜‚๐—ถ๐—น๐˜-๐—ถ๐—ป ๐—ฝ๐—ฎ๐—ฐ๐—ธ๐—ฎ๐—ด๐—ฒ๐˜€ ๐—ฎ๐—น๐—น๐—ผ๐˜„๐—ฒ๐—ฑ! The program must have an arithmetic function and must include these mathematical functions: Sine: (Such as: sin(3)) Cosine: (Such as: cos(4)) Tangent: (Such as tan(89)) Square root: (Such as โˆš4) Nth root: (Such as โˆ›8) Square of x: (Such as xยฒ) Nth exponent of x: (Such as xยณ) Order of operations: (Must obey order of operations) Parenthetical support: (Such as: (1+3)/2) Graphing equations: (Such as: xยณ + 2xยฒ - 1 or y = โˆšx) ๐‚๐ก๐š๐ฅ๐ฅ๐ž๐ง๐ ๐ž ๐ญ๐ข๐ฉ๐ฌ: A good model for your graphing calculator would be the ti-83 graphing calculator by Texas Instruments. ๐‘ญ๐’†๐’†๐’ ๐’‡๐’“๐’†๐’† ๐’•๐’ ๐’‘๐’๐’”๐’• ๐’š๐’๐’–๐’“ ๐’’๐’–๐’†๐’”๐’•๐’Š๐’๐’๐’” ๐’๐’ ๐’•๐’‰๐’Š๐’” ๐’„๐’‰๐’‚๐’๐’๐’†๐’๐’ˆ๐’† ๐’Š๐’ ๐’•๐’‰๐’† ๐’„๐’๐’Ž๐’Ž๐’†๐’๐’•๐’”! ๐“‘๐“ฎ๐“ผ๐“ฝ ๐“ž๐“ฏ ๐“›๐“พ๐“ฌ๐“ด! - เธ„เน”เธ„เน“ PRIZE MONEY! 0 dollars! It's a free challange! You really expect me to give you prize money!!!

/* @nexclap/AdamBlumenfeld */ The Deadline Of This Challenge: 8/8/2019! Good Luck!
Adam Blumenfeld Jul 08

I will periodically update the status of this challenge

Akhil Yeleswar Jul 08

Awesome. Should add a rule - no built in packages allowed

K Means Clustering In Infinite Dimensions (Without Sklearn)

This is K Means Clustering in infinite (that is, arbitrarily many) dimensions, finally complete! K Means Clustering falls under the category of clustering (hence the name), and clustering belongs to a subset of machine learning called unsupervised learning, which is machine learning on datasets without labels. I have provided a 2D dataset and a 4D dataset to test my model, and both worked. One caveat: a randomly initialized centroid can end up with no points, which would make its cluster mean divide by zero; the code keeps the previous centroid for empty clusters to avoid a ZeroDivisionError. To learn more about K Means Clustering, watch the following tutorial: https://www.youtube.com/watch?v=4b5d3muPQmA&t=296s
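For reference, each pass of the algorithm alternates the two standard K Means steps, which is what the code below does:

$$c_i := \arg\min_{k}\,\lVert x_i - \mu_k\rVert \;\;\text{(assign each point to its nearest centroid)},\qquad \mu_k := \frac{1}{\lvert C_k\rvert}\sum_{x_i \in C_k} x_i \;\;\text{(move each centroid to its cluster mean)}$$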

""" K Means Clustering In Infinite Dimensions By Adam Blumenfeld @nexclap/AdamBlumenfeld """ # Imports from random import randint as random from matplotlib import pyplot as plt from matplotlib import style from math import sqrt import numpy as np # Set style of graphs style.use("ggplot") # Dataset data=[(1,1), (1,2), (2,1),(2,3),(2,4),(3,3),(7,9),(7,6),(8,5),(8,9),(9,6),(9,9), (2, 8), (1, 9), (1,7), (3, 7), (4, 9)] # Multidimensional dataset _data=[(1,1, 2, 3), (1,2, 2, 4), (2,1, 2, 3),(2,3, 2, 4),(2,4, 3, 2),(3,3, 2, 1),(7,9, 8, 9),(7,6, 9, 8),(8,5, 8, 5),(8,9, 9, 6),(9,6, 6, 9),(9,9, 9, 5), (2, 8, 4, 5), (1, 9, 4, 6), (1,7, 4, 4), (3, 7, 3, 4), (4, 9, 5, 5)] # Plot dataset plt.scatter([point[0] for point in data], [point[1] for point in data]) plt.show() class KMeansClustering: # Helper function def fit(self, data, k=3, ndim=2, tolerance=0.001): self.k = k self.data = data self.ndim = ndim. self._centroids() self.cluster() self._fit(dev=self.std(), best=self.clusters) return self.clusters # Recursive function def _fit(self, dev=float('inf'), tolerance=0.001, best=None): self._centroids() self.cluster() if self.std() <= dev: if dev - self.std() <= tolerance: return best return self._fit(dev, tolerance, best) self._fit(dev, tolerance, best) # Recursive function for finding one cluster def cluster(self, m=None): self.assign() self.reassign_centroids() if self.clusters == m: return self.clusters return self.cluster(self.clusters) # Euclidean distance def euiclid(self, p1, p2): return sqrt(sum([(p2[i] - p1[i])**2 for i in range(len(p1))])) # Mean def mean(self, data): return [(sum([point[i] for point in data]) / len(data)) for i in range(self.ndim)] # Initialize first centroids: def _centroids(self): self.centroids = [[random(0, max(self.data)[i]) for i in range(self.ndim)] for z in range(self.k)] # Reassign centroids def reassign_centroids(self): self.centroids = [self.mean(self.clusters[i]) for i in range(len(self.clusters))] # Assign clusters: def assign(self): self.clusters = {i: [] for i in range(len(self.centroids))} for point in self.data: distances = [self.euiclid(centroid, point) for centroid in self.centroids] for i in range(len(self.centroids)): if min(distances) == distances[i]: self.clusters[i].append(point) # Standard deviation function def std(self): return sqrt(sum([sum([self.euiclid(self.centroids[key], point) for point in cluster]) for key, cluster in self.clusters.items()])) model = KMeansClustering() # Fit on two dimensional dataset model.fit(data, ndim=2) # Fit on four dimensional dataset # model.fit(_data, ndim=4)