Determining Number of Inputs in Model Design

Hi All,

Given a classification problem, when deciding which input features to include in a model, is it generally better to include more features at the risk that some or many of them turn out to have no importance, or to include fewer at the risk of missing an important input? How detrimental to training, if at all, would it be to have hundreds of input features when only a handful are relevant?

Thanks!

In classical machine learning, selecting the important features plays a major role, whereas in deep learning you can include as many features as you like and let the network learn which ones matter. Thank you
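
For illustration only, here is a minimal sketch (not from the original discussion; it assumes TensorFlow/Keras and scikit-learn's synthetic `make_classification` data): a small classifier is trained on 200 input features of which only 5 are informative. The irrelevant inputs mainly add noise and a bit of extra compute, but the network can still fit the task by assigning them small weights.

```python
# Illustrative sketch: many inputs, few of them relevant (synthetic data).
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
import tensorflow as tf

# 200 input features, but only 5 carry signal for the label.
X, y = make_classification(
    n_samples=5000,
    n_features=200,
    n_informative=5,
    n_redundant=0,
    random_state=0,
)
X_train, X_val, y_train, y_val = train_test_split(
    X, y, test_size=0.2, random_state=0
)

# A small binary classifier; the irrelevant inputs simply end up
# with small weights rather than breaking training.
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(200,)),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(32, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(
    X_train, y_train,
    validation_data=(X_val, y_val),
    epochs=10, batch_size=64, verbose=2,
)
```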

Hi @chunduriv, I appreciate the response. Sounds like using many features/parameters is not a problem and may even be preferable, yes?

Thanks!

Yes, in deep learning. Thank you