# DS502_Homework2

## MLP Sample Output

### Class 2 Multiple Layer Perceptron (MLP) Example

```
Epoch 0: loss = 1.368944852501666, accuracy = 0.8406666666666667
Epoch 1: loss = 0.8847605154029998, accuracy = 0.924
Epoch 2: loss = 0.6223828523212359, accuracy = 0.9353333333333333
Epoch 3: loss = 0.46528025481353785, accuracy = 0.9466666666666667
Epoch 4: loss = 0.37985555506669233, accuracy = 0.9566666666666667
Epoch 5: loss = 0.3131803341034176, accuracy = 0.9646666666666667
Epoch 6: loss = 0.2703222406641372, accuracy = 0.9713333333333334
Epoch 7: loss = 0.2412709649794277, accuracy = 0.9753333333333334
Epoch 8: loss = 0.2122792520184997, accuracy = 0.9773333333333334
Epoch 9: loss = 0.19184901792076034, accuracy = 0.9773333333333334
Epoch 10: loss = 0.1739593897822734, accuracy = 0.9813333333333333
Epoch 11: loss = 0.16300189122039285, accuracy = 0.9793333333333333
Epoch 12: loss = 0.1480960513893727, accuracy = 0.9826666666666667
Epoch 13: loss = 0.14036218400400097, accuracy = 0.9833333333333333
Epoch 14: loss = 0.1304579673453462, accuracy = 0.9826666666666667
Epoch 15: loss = 0.12197091190161545, accuracy = 0.986
Epoch 16: loss = 0.11566551865542593, accuracy = 0.9866666666666667
Epoch 17: loss = 0.11024152950942553, accuracy = 0.986
Epoch 18: loss = 0.10596325143659664, accuracy = 0.9873333333333333
Epoch 19: loss = 0.09938557470148325, accuracy = 0.9893333333333333
Epoch 20: loss = 0.09492650091071682, accuracy = 0.9893333333333333
Epoch 21: loss = 0.09038558075919634, accuracy = 0.9906666666666667
Epoch 22: loss = 0.08745535480984103, accuracy = 0.9906666666666667
Epoch 23: loss = 0.08310629131413616, accuracy = 0.992
Epoch 24: loss = 0.08072813302395683, accuracy = 0.9913333333333333
Epoch 25: loss = 0.0773309045302611, accuracy = 0.9926666666666667
Epoch 26: loss = 0.0743805183050479, accuracy = 0.9933333333333333
Epoch 27: loss = 0.07191968740388602, accuracy = 0.9933333333333333
Epoch 28: loss = 0.07088971896865372, accuracy = 0.9946666666666667
Epoch 29: loss = 0.06728842472483931, accuracy = 0.994
Epoch 30: loss = 0.06510765901501725, accuracy = 0.9933333333333333
Epoch 31: loss = 0.06285995941490086, accuracy = 0.9946666666666667
Epoch 32: loss = 0.06161273885133333, accuracy = 0.9946666666666667
Epoch 33: loss = 0.059785157406702455, accuracy = 0.9953333333333333
Epoch 34: loss = 0.05784672082939381, accuracy = 0.9953333333333333
Epoch 35: loss = 0.056563795351406834, accuracy = 0.9966666666666667
Epoch 36: loss = 0.05558385835941364, accuracy = 0.996
Epoch 37: loss = 0.053422644645735026, accuracy = 0.996
Epoch 38: loss = 0.05203657270313028, accuracy = 0.998
Epoch 39: loss = 0.051556730099684714, accuracy = 0.996
Epoch 40: loss = 0.049753840860794464, accuracy = 0.998
Epoch 41: loss = 0.04861505538568997, accuracy = 0.9973333333333333
Epoch 42: loss = 0.04855883616941398, accuracy = 0.9966666666666667
Epoch 43: loss = 0.04660213373754598, accuracy = 0.998
Epoch 44: loss = 0.04543506905195473, accuracy = 0.9986666666666667
Epoch 45: loss = 0.04421333021249591, accuracy = 0.9986666666666667
Epoch 46: loss = 0.04323571659729185, accuracy = 0.9986666666666667
Epoch 47: loss = 0.04244608789909821, accuracy = 0.9986666666666667
Epoch 48: loss = 0.042233956026031694, accuracy = 0.998
Epoch 49: loss = 0.04137651449401532, accuracy = 0.9986666666666667
Epoch 50: loss = 0.04028853688055021, accuracy = 0.9986666666666667
Epoch 51: loss = 0.04002178043643203, accuracy = 0.9986666666666667
Epoch 52: loss = 0.03860820808529442, accuracy = 0.9993333333333333
Epoch 53: loss = 0.037766033151749534, accuracy = 0.9986666666666667
Epoch 54: loss = 0.03738936221090967, accuracy = 0.9993333333333333
Epoch 55: loss = 0.03722803719462149, accuracy = 0.9986666666666667
Epoch 56: loss = 0.035754538361255514, accuracy = 0.9993333333333333
Epoch 57: loss = 0.03529980374839103, accuracy = 0.9986666666666667
Epoch 58: loss = 0.034682753912205794, accuracy = 0.9993333333333333
Epoch 59: loss = 0.0340192951616642, accuracy = 0.9993333333333333
Epoch 60: loss = 0.03376117420931492, accuracy = 0.9986666666666667
Epoch 61: loss = 0.0330197375125858, accuracy = 0.9993333333333333
Epoch 62: loss = 0.03285746784509473, accuracy = 0.9993333333333333
Epoch 63: loss = 0.03222544558037743, accuracy = 0.9986666666666667
Epoch 64: loss = 0.03184300600431508, accuracy = 0.9993333333333333
Epoch 65: loss = 0.03104726044330413, accuracy = 0.9993333333333333
Epoch 66: loss = 0.03065206531915598, accuracy = 0.9993333333333333
Epoch 67: loss = 0.030218390606656712, accuracy = 0.9993333333333333
Epoch 68: loss = 0.03012565761925978, accuracy = 0.9993333333333333
Epoch 69: loss = 0.029489124209285515, accuracy = 0.9993333333333333
Epoch 70: loss = 0.02908652761680944, accuracy = 0.9993333333333333
Epoch 71: loss = 0.02875577477098771, accuracy = 0.9993333333333333
Epoch 72: loss = 0.028806669230446214, accuracy = 0.9993333333333333
Epoch 73: loss = 0.02798909208451092, accuracy = 0.9993333333333333
Epoch 74: loss = 0.027745955278722985, accuracy = 0.9993333333333333
Epoch 75: loss = 0.02738681912320972, accuracy = 1.0
Epoch 76: loss = 0.02705115776456636, accuracy = 0.9993333333333333
Epoch 77: loss = 0.026671483149406237, accuracy = 0.9993333333333333
Epoch 78: loss = 0.02654054022057572, accuracy = 1.0
Epoch 79: loss = 0.025995225192931484, accuracy = 1.0
Epoch 80: loss = 0.025733566196931505, accuracy = 1.0
Epoch 81: loss = 0.02546965633300529, accuracy = 1.0
Epoch 82: loss = 0.025292366042632047, accuracy = 1.0
Epoch 83: loss = 0.02490288885315363, accuracy = 1.0
Epoch 84: loss = 0.024708703131542872, accuracy = 1.0
Epoch 85: loss = 0.024421744723726244, accuracy = 1.0
Epoch 86: loss = 0.02415612240600401, accuracy = 1.0
Epoch 87: loss = 0.0242107644069187, accuracy = 1.0
Epoch 88: loss = 0.023728844704990183, accuracy = 1.0
Epoch 89: loss = 0.023433790024659028, accuracy = 1.0
Epoch 90: loss = 0.023297302541274798, accuracy = 1.0
Epoch 91: loss = 0.02309220356170118, accuracy = 1.0
Epoch 92: loss = 0.022835551034069576, accuracy = 1.0
Epoch 93: loss = 0.022582597127702924, accuracy = 1.0
Epoch 94: loss = 0.022465685149601722, accuracy = 1.0
Epoch 95: loss = 0.02220419113480437, accuracy = 1.0
Epoch 96: loss = 0.022187033775914237, accuracy = 1.0
Epoch 97: loss = 0.021863603060381095, accuracy = 1.0
Epoch 98: loss = 0.021682208492871818, accuracy = 1.0
Epoch 99: loss = 0.02149454333301383, accuracy = 1.0
Test Accuracy: 0.9326599326599326
```
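For context, the per-epoch log format above is what a simple from-scratch training loop would print. The following is only a minimal sketch, not the repository's actual code: it assumes a one-hidden-layer NumPy MLP trained by full-batch gradient descent on one-hot targets, and the layer size, learning rate, and function names are all illustrative.

```python
import numpy as np

def softmax(z):
    # numerically stable softmax over the class axis
    e = np.exp(z - z.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

def train_mlp(X, y, hidden=64, epochs=100, lr=0.1):
    # hypothetical from-scratch MLP: ReLU hidden layer, softmax output,
    # cross-entropy loss, full-batch gradient descent
    n, d = X.shape
    k = int(y.max()) + 1
    rng = np.random.default_rng(0)
    W1 = rng.normal(0, 0.1, (d, hidden)); b1 = np.zeros(hidden)
    W2 = rng.normal(0, 0.1, (hidden, k)); b2 = np.zeros(k)
    Y = np.eye(k)[y]                                   # one-hot targets
    for epoch in range(epochs):
        # forward pass
        h = np.maximum(0, X @ W1 + b1)
        p = softmax(h @ W2 + b2)
        loss = -np.mean(np.sum(Y * np.log(p + 1e-12), axis=1))
        acc = np.mean(p.argmax(axis=1) == y)
        print(f"Epoch {epoch}: loss = {loss}, accuracy = {acc}")
        # backward pass: gradient of mean cross-entropy w.r.t. logits
        dz2 = (p - Y) / n
        dW2 = h.T @ dz2; db2 = dz2.sum(axis=0)
        dh = dz2 @ W2.T * (h > 0)                      # ReLU mask
        dW1 = X.T @ dh; db1 = dh.sum(axis=0)
        W1 -= lr * dW1; b1 -= lr * db1
        W2 -= lr * dW2; b2 -= lr * db2
    return W1, b1, W2, b2

if __name__ == "__main__":
    # toy data just to make the sketch runnable end to end
    rng = np.random.default_rng(1)
    X = rng.normal(size=(200, 20))
    y = (X[:, 0] > 0).astype(int)
    train_mlp(X, y, hidden=16, epochs=5, lr=0.1)
```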

### Class 2 sklearn MLP Example

```
Iteration 1, loss = 2.37029905
Iteration 2, loss = 2.26380956
Iteration 3, loss = 2.16220843
Iteration 4, loss = 2.08024079
Iteration 5, loss = 2.00289872
Iteration 6, loss = 1.93227071
Iteration 7, loss = 1.86649838
Iteration 8, loss = 1.80478412
Iteration 9, loss = 1.74669235
Iteration 10, loss = 1.69239376
Iteration 11, loss = 1.63971269
Iteration 12, loss = 1.59056035
Iteration 13, loss = 1.54346583
Iteration 14, loss = 1.49821128
Iteration 15, loss = 1.45543655
Iteration 16, loss = 1.41406274
Iteration 17, loss = 1.37469218
Iteration 18, loss = 1.33682624
Iteration 19, loss = 1.30040059
Iteration 20, loss = 1.26553731
Iteration 21, loss = 1.23206559
Iteration 22, loss = 1.19991211
Iteration 23, loss = 1.16906297
Iteration 24, loss = 1.13955396
Iteration 25, loss = 1.11099869
Iteration 26, loss = 1.08385392
Iteration 27, loss = 1.05739599
Iteration 28, loss = 1.03221186
Iteration 29, loss = 1.00792130
Iteration 30, loss = 0.98444112
Iteration 31, loss = 0.96194694
Iteration 32, loss = 0.94010942
Iteration 33, loss = 0.91920695
Iteration 34, loss = 0.89902747
Iteration 35, loss = 0.87948251
Iteration 36, loss = 0.86051630
Iteration 37, loss = 0.84242934
Iteration 38, loss = 0.82482081
Iteration 39, loss = 0.80799864
Iteration 40, loss = 0.79165789
Iteration 41, loss = 0.77579051
Iteration 42, loss = 0.76040643
Iteration 43, loss = 0.74560284
Iteration 44, loss = 0.73147907
Iteration 45, loss = 0.71756541
Iteration 46, loss = 0.70417056
Iteration 47, loss = 0.69123037
Iteration 48, loss = 0.67863234
Iteration 49, loss = 0.66635860
Iteration 50, loss = 0.65465954
Iteration 51, loss = 0.64322712
Iteration 52, loss = 0.63220574
Iteration 53, loss = 0.62148490
Iteration 54, loss = 0.61093957
Iteration 55, loss = 0.60087111
Iteration 56, loss = 0.59098984
Iteration 57, loss = 0.58154056
Iteration 58, loss = 0.57224004
Iteration 59, loss = 0.56326154
Iteration 60, loss = 0.55449202
Iteration 61, loss = 0.54607373
Iteration 62, loss = 0.53783662
Iteration 63, loss = 0.52980736
Iteration 64, loss = 0.52208395
Iteration 65, loss = 0.51440387
Iteration 66, loss = 0.50709418
Iteration 67, loss = 0.49990385
Iteration 68, loss = 0.49283375
Iteration 69, loss = 0.48609233
Iteration 70, loss = 0.47943101
Iteration 71, loss = 0.47303383
Iteration 72, loss = 0.46662706
Iteration 73, loss = 0.46046870
Iteration 74, loss = 0.45454249
Iteration 75, loss = 0.44858127
Iteration 76, loss = 0.44297829
Iteration 77, loss = 0.43737478
Iteration 78, loss = 0.43192624
Iteration 79, loss = 0.42670403
Iteration 80, loss = 0.42158924
Iteration 81, loss = 0.41637447
Iteration 82, loss = 0.41146769
Iteration 83, loss = 0.40665947
Iteration 84, loss = 0.40187389
Iteration 85, loss = 0.39730819
Iteration 86, loss = 0.39270380
Iteration 87, loss = 0.38836182
Iteration 88, loss = 0.38403872
Iteration 89, loss = 0.37975411
Iteration 90, loss = 0.37564117
Iteration 91, loss = 0.37162489
Iteration 92, loss = 0.36765050
Iteration 93, loss = 0.36387182
Iteration 94, loss = 0.35998180
Iteration 95, loss = 0.35632778
Iteration 96, loss = 0.35263144
Iteration 97, loss = 0.34911045
Iteration 98, loss = 0.34558233
Iteration 99, loss = 0.34220075
Iteration 100, loss = 0.33882596
Training set score: 0.959333
Test set score: 0.878788
```
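The "Iteration N, loss = ..." lines above are what scikit-learn's `MLPClassifier` prints when `verbose=True`. Below is a hedged reconstruction of the kind of call that produces such a log; the dataset, hidden layer size, and other hyperparameters are assumptions, not the homework's actual settings.

```python
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

# assumed dataset: sklearn's digits; pixel values are 0-16, so scale to [0, 1]
X, y = load_digits(return_X_y=True)
X = X / 16.0
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.33, random_state=0)

# verbose=True prints "Iteration N, loss = ..." once per epoch;
# max_iter=100 stops before convergence, so sklearn may emit a
# ConvergenceWarning (harmless here)
clf = MLPClassifier(hidden_layer_sizes=(50,), max_iter=100, solver="sgd",
                    learning_rate_init=0.1, verbose=True, random_state=1)
clf.fit(X_train, y_train)

# .score() returns mean accuracy, matching the two summary lines above
print("Training set score: %f" % clf.score(X_train, y_train))
print("Test set score: %f" % clf.score(X_test, y_test))
```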

My_MLP seems too good to be true: it reaches 100% training accuracy, while sklearn's MLP only reaches 95.9%. Any hints? Thank you!
