Revisit starter implementation policy for nextercism #82
I like the idea of having stubs for basic/entry-level exercises, or for exercises that introduce completely new language-specific syntax that a student can't easily figure out on their own without a lot of googling. Otherwise, an empty file should suffice for all exercises. This is how I think it should be:
After solving the first 5 or so exercises, we can expect students to go on their own without a stub. Test suites will provide enough information on how to proceed and what is required.
I like the thought of "hand-holding" for the first couple of exercises. I also think that for any exercises we consider "core", we should consider what level of stubbing is appropriate. I see the "core" exercises as a way to present the language itself, along with key concepts within it, which I envision could be a lot. But these "core" exercises depend on whether there are specifications that introduce new concepts without being overly difficult.

@kabiir As for the 5 levels of stubbing you presented, maybe we can codify them into categories? Otherwise, I was thinking of tying them to a difficulty level, but that may lead to confusion or conflict later.
@Stargator Key concepts also include introducing syntactic sugar, which I find difficult to explain or introduce via test suites. What I'm aiming for is a clever example in the exercise README or hints.md that intrigues a student to either explore or use the concept. By a clever example I mean not just serving a concept up on a silver platter, but puzzling it a little, so that discovering it intrigues them more. So maybe after figuring out how to assign difficulties and deciding on the 5-level stubs, any subsequent exercises may/should contain such clever examples.
I'm unsure about being "clever". Some concepts would have to be introduced via hints or suggestions, leaving it to the user to decide how to implement the solution. Because, no matter how clever a problem is, Dart is flexible enough that we can't assume how a solution will be developed.

For example, suppose we hint at or suggest using regular expressions in a "core" exercise, and that exercise unlocks another that explicitly depends on regular expressions (e.g. the values used in its test suite are regular expressions). Then it would be up to the user to get up to speed. The core exercise may hint at or suggest regular expressions, but we have to tread lightly about what kinds of concepts we can assume the user will already know.

I'm starting to think we may come to a point where we need to either update the README, add pages to the wiki, or create new documentation about the track's exercises and how they build off each other.
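As a concrete illustration of the hint-versus-stub distinction discussed above, a core exercise's hints.md could name Dart's `RegExp` class and leave the pattern and solution structure to the student. A minimal sketch, using a hypothetical ISBN-style validator (the function name and pattern are illustrative, not taken from any actual exercise):

```dart
// Hypothetical exercise solution a hint might steer a student toward:
// the hint names RegExp (from dart:core), nothing more.
bool isValidIsbn(String input) {
  // Raw strings (r'...') avoid double-escaping backslashes.
  // Pattern: nine digits followed by a digit or 'X'.
  final pattern = RegExp(r'^\d{9}[\dX]$');
  return pattern.hasMatch(input.replaceAll('-', ''));
}

void main() {
  print(isValidIsbn('3-598-21508-8')); // true
  print(isValidIsbn('3-598-21508-A')); // false
}
```

The point is that the hint only unlocks the concept; the student still has to discover `hasMatch`, raw strings, and the pattern syntax on their own.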
Well, exercises don't necessarily cover all aspects of the Dart world, which means we'll probably need to come up with our own exercises covering topics such as:

- Getting started with Dart: where you are supposed to use Dart, how you are supposed to use it, and so on.

Why would a student choose the Dart track on Exercism in the first place? Probably because they heard of or saw AngularDart, Flutter, or some other framework/platform use of Dart, and how easy and fast it is to develop with. I think we need to think this through and design a plan/policy for how Dart is to be taught, not just in the sense of Exercism, but Dart as its own world. Now, again, is it Exercism's job to teach every framework out there? No! I don't think so. So we have to draw a line for our track: which things from the Dart world we will introduce, and which will be left for the student to explore.
Is Exercism about teaching a language, or about providing an environment where people are given problems to solve? Currently, Exercism itself doesn't have pages detailing each language's features. Each language track provides a cursory overview of the language, some steps for installing the software needed to use it, and then two pages with resources for digging deeper. Outside of that, each exercise may or may not provide language-specific information, hints, or suggestions. The central point is that it is up to each track to decide whether or not to include language-specific information in the README for an exercise.

So I think it is a bit of a stretch to say Exercism teaches users about languages, though again, track maintainers are given a lot of room to decide the granular details. Even after reading descriptions of v2 of Exercism, I'm not convinced it's about "teaching" so much as about providing a collaborative environment where people can reach out to teams, mentors, and track maintainers when the resources they have (websites, documents, etc.) are not helpful or clear.

And if Exercism is about teaching, then what does it mean to "teach"? Where is the fine line between providing just enough information to let a person solve the problem, giving them too much information, and solving the problem for them? That's without even counting language barriers, or the "teacher" phrasing new concepts differently than the student would.

Summary:
I would say it provides an environment where people are given problems to solve, which in turn facilitates learning a language. Mentorship helps with learning further techniques and mechanics that others may not have been aware of before. The exercises themselves are not enough to completely learn a language though. They're a catalyst in my opinion.
I think that's more of a question for the folks who create the problem specifications and less about the Dart track. Either the specifications give them enough information or not. I will say that since each exercise has tests associated with it, it's easier for us to stub out classes and methods for the students so that they at least have something to base their code on. When we go to implement an exercise, the paradigms of Dart will need to be considered up front.

I would say that having stubs with the expected signatures in each exercise removes the need for people to look at the tests, which I think they should do. I think our current model of providing either a class or higher-order function stub works pretty well. If folks want to figure out a different way to solve a problem, they should feel empowered to change the tests and their implementation to learn something new. If we generate an exercise and think it could fit an educational use case that we haven't implemented yet, we should feel empowered to alter it to our will.
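For reference, the two stub shapes mentioned above might look like the following sketch. The names are illustrative, not taken from any specific exercise; the key property is that each stub exposes exactly the signature the test suite calls and throws until the student fills it in:

```dart
// Class-style stub: the test suite constructs the class and calls the
// method, so the expected signature is visible up front.
class Bob {
  String response(String statement) {
    throw UnimplementedError();
  }
}

// Higher-order function stub, for specs that pass a function as input.
List<int> accumulate(List<int> input, int Function(int) fn) {
  throw UnimplementedError();
}
```

`UnimplementedError` is part of dart:core, so a stub like this compiles cleanly and fails every test with a clear message until implemented.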
@exercism/dart
The next version of Exercism breaks exercises into core exercises and branch exercises. It will therefore be possible for users to complete exercises in many different orders.
In light of this, we should revisit our current policy regarding starter implementations, which "assumes" a fixed order of exercise completion, and decide whether/how to update it. Discussion should occur in this issue thread.
This discussion stems from exercism/discussions.
Current policy for reference:
We currently just create the implementation file with an empty class most of the time. Should we stub other methods and classes? I think a few exercises also stub a method.
I know our intent was to have the user generate or implement each missing required piece, until all that was left was figuring out the logic.
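To make the difference concrete, the current policy usually ships something like the first form below, while the open question is whether to also pre-declare the members the tests call. The class and method names here are illustrative, not from any particular exercise:

```dart
// Current policy: an empty class, leaving the student to derive the
// required members from the test suite.
class Acronym {}

// Alternative under discussion: pre-declare the members the tests
// exercise, so the expected signature is visible up front.
class AcronymWithStub {
  String abbreviate(String phrase) {
    throw UnimplementedError();
  }
}
```

With the empty class, the first compile errors come from the tests referencing missing members; with the method stub, the code compiles and the tests fail at runtime instead.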
peterseng summed it up nicely with this: