• gnus_migrate

    If you wanted to introduce every industry best practice in an intro course, you’d never get to the actual programming.

    It would be good to have a 1-credit course (one hour a week) where you learn industry best practices like version control, testing, and stuff like that. But it definitely shouldn’t be at the start.

    • robinm@lemmyrs.org

      If teachers used automated tests instead of printf in their intro courses, it would be so much better. I don’t think introducing all the various kinds of tests is useful, but just showing the concept of automated tests instead of manual ones would be a huge step forward.
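
      Something like this (just a minimal sketch, not tied to any particular course) is all I mean by that: the student still runs something and sees a result, but an assertion does the checking instead of someone eyeballing print output.

          # Manual checking: run the program and eyeball the console.
          def add(a, b):
              return a + b

          print(add(2, 3))  # a person has to decide that "5" looks right

          # Automated checking: an assertion decides, and it can be re-run any time.
          def test_add():
              assert add(2, 3) == 5
              assert add(-1, 1) == 0

          test_add()
          print("all checks passed")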

      • gnus_migrate

        The thing is, the way they motivate new students to learn programming is by having them write programs that do something. Making a test green isn’t as motivating as visually seeing the output of your work, and test fixtures can be complex to set up depending on the language. I mean, students don’t learn how to factor their code into methods until later in such a course; they’re learning if statements, for loops, and basic programming constructs. Don’t you think having to explain setting up test fixtures and dependency inversion is a bit too much for people at that level?

    • sugar_in_your_tea@sh.itjust.works

      Hard disagree. Cover less material if needed, but students should get into the habit of writing tests for everything they turn in. If I were a professor, I would reject any submitted code that didn’t have tests, for the same reason that math teachers reject work when students don’t show their work.

      • gnus_migrate

        There’s a difference between tests and assertions. Students do test their code; they just don’t write assertions because, as I said, you want the cognitive load to be as low as possible so that they can master the basics. I’m fine with tests being provided to them, but they should be focusing on learning the constructs at the start.
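
        To make that concrete (a hypothetical sketch, not from any actual course material): “tests being provided to them” can be as simple as an instructor-written check, while the student only writes the loop they’re practicing.

            # Provided by the instructor; students don't write or even read this part.
            def check(student_fn):
                assert student_fn([3, 1, 2]) == 6
                assert student_fn([]) == 0
                print("looks good")

            # The student's part: practicing basic constructs (a loop and an accumulator).
            def sum_list(numbers):
                total = 0
                for n in numbers:
                    total += n
                return total

            check(sum_list)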

        In any field, the real-life practice of a profession is something you learn working for an actual company, whether through an internship or an entry-level job. Ideally there would be unions or syndicates setting these standards so that they’re consistent across the field, just like in other knowledge-based professions.

        Universities are not corporate training programs, and they aren’t supposed to be.

        • sugar_in_your_tea@sh.itjust.works

          A huge part of computer science is proving correctness, complexity, etc. Almost all of my classes had an automated test suite that your code needed to pass to get full credit for the assignment. I think it’s completely reasonable that you “show your work” by writing your own tests from the start.

          If programming is just one or two classes in your program (e.g. you’re doing IT or something), then I can see testing not being a part of it. But if you’re going after a formal CS or CS-adjacent degree, you should be in the habit of proving the correctness of your code.

          I’m totally fine with other industry norms being ignored, such as code style, documentation, and defensive programming; however, testing should absolutely be a regular part of any form of software development. I want every CS grad to always be thinking in terms of “how can I prove this” instead of just “how can I solve this.” I don’t think 100% code coverage should be expected, but students should prove the most important part of their solution.
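
          As a rough illustration of what I mean by proving the most important part (a hypothetical example, not from any of my classes): for a sorting assignment, the property worth asserting is that the output is ordered and has the same elements as the input, not that every helper is covered.

              import random

              def insertion_sort(items):
                  result = []
                  for x in items:
                      i = 0
                      while i < len(result) and result[i] < x:
                          i += 1
                      result.insert(i, x)
                  return result

              # The key property: the output is ordered and is a permutation of the input.
              def test_insertion_sort():
                  for _ in range(100):
                      data = [random.randint(-50, 50) for _ in range(random.randint(0, 20))]
                      out = insertion_sort(data)
                      assert all(out[i] <= out[i + 1] for i in range(len(out) - 1))
                      assert sorted(out) == sorted(data)

              test_insertion_sort()
              print("key property holds")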