
In the gradient descent technique, we choose an alpha value (the learning rate) when computing the parameters (theta 0 and theta 1). What will happen if we assign a very small value to alpha?

1) The model's computations may take a long time to converge
2) The model may never converge
3) There will be no need to iterate
4) The speed of the computations will be very high

Answer:

The correct choice is 1): the model's computations may take a long time to converge. Each gradient descent update moves the parameters by alpha times the gradient, so a very small alpha means each step covers only a tiny distance and many more iterations are needed to reach the minimum. A sufficiently small alpha still converges (it does not diverge), iteration is still required, and the overall computation becomes slower, not faster, which rules out the other options.
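The effect can be seen in a minimal sketch (an illustration I am adding, not part of the original question): running gradient descent on the simple one-parameter objective f(theta) = theta^2 with a moderate learning rate versus a very small one, and counting the iterations each needs to converge.

```python
def gradient_descent_iters(alpha, theta=10.0, tol=1e-6, max_iters=1_000_000):
    """Minimize f(theta) = theta**2, whose gradient is 2*theta.

    Returns the number of iterations until |theta| < tol,
    or max_iters if that tolerance is never reached.
    """
    for i in range(max_iters):
        if abs(theta) < tol:
            return i
        # Standard update rule: theta := theta - alpha * f'(theta)
        theta -= alpha * 2 * theta
    return max_iters

fast = gradient_descent_iters(alpha=0.3)    # moderate learning rate
slow = gradient_descent_iters(alpha=0.001)  # very small learning rate
print(fast, slow)  # the tiny alpha needs hundreds of times more iterations
```

Both runs eventually converge, but the very small alpha takes far more iterations to reach the same tolerance, which is exactly option 1).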