The GitHub Copilot Handbook
When using tools that generate parts of our code base, we always need to keep several ethical aspects in mind. One aspect is code generation itself: large language models display certain traits based on the way they have been trained. We already mentioned issues such as bias in Chapter 2. Another aspect is how engineers use these tools and how diligent they are about the way the tools change how they work. If an engineer stops thinking and blindly accepts every output of a large language model, did they actually gain anything? We don't think so. This second aspect is more of a cultural effect that AI can have on how teams work: we need to refrain from simply pointing at the tools and saying that because it was generated by AI, it should just work. We are the human in the loop, and we need to bring our professional judgment to creating business value for our end users.
As engineers, that means we need to stay on top of things and be the human...