Google Machine Learning Crash Course

A review and my impressions of Google's machine learning course

I have completed Google's Machine Learning Crash Course. It is an introductory course that gives you the fundamentals and shows examples of real implementations with TensorFlow. Those examples were what motivated me to take it.

Crash Course vs. Coursera's Machine Learning

It is an easier, more hands-on course than Coursera's Machine Learning course. We could say that the Coursera course makes you see how the algorithms work inside, while in Google's Crash Course those algorithms are treated as black boxes: you get a brief explanation and are taught how to implement them with TensorFlow.

And that is the big difference. Rather than explaining the various Machine Learning concepts and algorithms in depth, the Google course teaches us how to apply them and how to start using TensorFlow and Keras.

All the exercises are done in Google Colab, with the development environment already set up. That is a big difference from the Coursera course, which uses Matlab or Octave to implement the algorithms, but where you learn nothing about TensorFlow or about how to tackle a real problem.
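To give an idea of what "implementing them with TensorFlow" looks like in practice, here is a minimal sketch of my own (it is not taken from the course notebooks, and the data is made up): a linear regression fitted with a single Dense unit in Keras.

```python
# Minimal sketch (my own, not a course notebook): linear regression in Keras.
import numpy as np
import tensorflow as tf

# Made-up synthetic data: y = 3x + 2 plus a little noise.
x = np.random.uniform(-1.0, 1.0, size=(200, 1)).astype("float32")
y = 3.0 * x + 2.0 + np.random.normal(0.0, 0.1, size=(200, 1)).astype("float32")

# A single dense unit with no activation is exactly a linear regression model.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(1,)),
    tf.keras.layers.Dense(1),
])
model.compile(optimizer=tf.keras.optimizers.SGD(learning_rate=0.1),
              loss="mean_squared_error")

model.fit(x, y, epochs=50, verbose=0)
print(model.get_weights())  # weights should end up close to 3.0 and 2.0
```

In the course this kind of snippet runs directly in a Colab notebook, so there is nothing to install.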

To quote what I said in my review of that course:

That is no small thing. But it is probably also why it is a good way to start, because you learn not only what to do but why you do it:

- When to choose one algorithm or another.

- How to choose and interpret the different parameters.

- What problems can come up with the models and, above all, what measures to take.

Google's Machine Learning Crash Course, on the other hand, can be taken even if you do not have a strong mathematical background, unlike Andrew Ng's Coursera course.

Agenda: what the course covers

An introductory course on machine learning

First you start with an explanation of what Machine Learning is, its main concepts and the types of problems it addresses. With that covered, it is time to go through the following topics. Apologies for the amount of English terminology, but the course is in English (although it is very easy to follow) and most of the key terms either have no translation or lose their meaning when translated, because in the Machine Learning world everyone, everywhere, refers to them in English.

  • Linear regression
  • Squared loss: a popular loss function
  • Gradient descent and stochastic gradient descent
  • Learning rate
  • Generalization
  • Training, validation and test sets
  • Representation (feature engineering)
  • Feature crosses and one-hot encoding
  • Nonlinearities
  • Regularization for simplicity and sparsity (L1 and L2)
  • Logistic regression
  • Classification
  • Accuracy, precision and recall
  • ROC curve and AUC
  • Neural networks (training, one vs. all, softmax)
  • Embeddings

As I said, everything is done with Google Colab.
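Just to connect a few of those agenda items, here is a small sketch of my own (not the course's Colab; the data and numbers are invented): a logistic regression expressed as a one-unit Keras model, trained with SGD and an explicit learning rate, with L2 regularization, and evaluated with accuracy and AUC.

```python
# Sketch (mine, with invented data) tying together several agenda items:
# logistic regression, SGD + learning rate, L2 regularization, accuracy and AUC.
import numpy as np
import tensorflow as tf

# Made-up binary classification data: label is 1 when the feature sum is positive.
features = np.random.normal(size=(500, 4)).astype("float32")
labels = (features.sum(axis=1) > 0).astype("float32")

model = tf.keras.Sequential([
    tf.keras.Input(shape=(4,)),
    tf.keras.layers.Dense(
        1,
        activation="sigmoid",                               # logistic regression
        kernel_regularizer=tf.keras.regularizers.l2(0.01),  # L2: the "simplicity" penalty
    ),
])
model.compile(
    optimizer=tf.keras.optimizers.SGD(learning_rate=0.05),   # learning rate is a hyperparameter to tune
    loss="binary_crossentropy",                              # log loss for classification
    metrics=["accuracy", tf.keras.metrics.AUC(name="auc")],  # accuracy and ROC AUC
)
model.fit(features, labels, epochs=20, validation_split=0.2, verbose=0)
print(model.evaluate(features, labels, verbose=0))  # [loss, accuracy, auc]
```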

Who is it for?

If you are just starting out and want to learn how to implement simple examples, it is a good way to begin.

The course is about 15 hours long, which you can do at your own pace, and although there are exercises, you do not have to hand in assignments or pass any exams.

The course is free.

What now?

Since they are quick, I will certainly take a look at the rest of the courses Google offers.

Beyond continuing to try some of the courses we still have on the list to see what they are like, I think that if I take on something more serious I will make much more progress.

I am starting an important project at work to build a tool, and what I need now is to start applying everything I have learned so far and wrestle with real problems.

I will keep sharing my progress on the blog.
