EA - Winter ML upskilling camp by Nathan Barnard

The Nonlinear Library: EA Forum - A podcast by The Nonlinear Fund



Welcome to The Nonlinear Library, where we use Text-to-Speech software to convert the best writing from the Rationalist and EA communities into audio. This is: Winter ML upskilling camp, published by Nathan Barnard on December 2, 2022 on The Effective Altruism Forum.

Title: Apply for the ML Winter Camp in Cambridge, UK [2-10 Jan]

TL;DR: We are running a UK-based ML upskilling camp from 2-10 January in Cambridge for people with no prior experience in ML who want to work on technical AI safety. Apply here by 11 December.

We (Nathan Barnard, Joe Hardie, Quratul Zainab and Hannah Erlebach) will be running a machine learning upskilling camp this January in conjunction with the Cambridge AI Safety Hub. The camp is designed for people with little-to-no ML experience to work through a curriculum based on the first two weeks of MLAB under the guidance of experienced mentors, in order to develop skills which are necessary for conducting many kinds of technical AI safety research.

The camp will take place from 2-10 January in Cambridge, UK. Accommodation will be provided at Emmanuel College. There are up to 20 in-person spaces; the camp will take place in the Sidney Street Office in central Cambridge.

There is also the option to attend online for those who cannot attend in person, although participants are strongly encouraged to attend in person if possible, as we expect it to be substantially harder to make progress when attending online. As such, our bar for accepting virtual participants will be higher.
We can cover travel costs if this is a barrier to attending in person.

Apply to be a participant

Who we are looking for

The typical participant we are looking for will have:
- Strong quantitative skills (e.g., a maths/physics/engineering background)
- An intention to work on AI safety research projects which require ML experience
- Little-to-no prior ML experience

The following are strongly preferred, but not essential:
- Programming experience (preferably Python)
- AI safety knowledge equivalent to having at least completed the AGI Safety Fundamentals alignment curriculum

The camp is open to participants from all over the world, but in particular those from the UK and Europe; for those located in the USA or Canada, we recommend (also) applying for the CBAI Winter ML Bootcamp, happening in either Boston or Berkeley (deadline 4 December).

If you're unsure whether you're a good fit for this camp, we encourage you to err on the side of applying. We recognise that evidence suggests less privileged individuals tend to underestimate their abilities, and we encourage individuals with diverse backgrounds and experiences to apply; we especially encourage applications from women and minorities.

How to apply

Fill out the application form by Sunday 11 December, 23:59 GMT+0. Decisions will be released no later than 16 December; if you require an earlier decision in order to make plans for January, you can say so in your application.

Apply to be a mentor

We are looking for mentors to be present full- or part-time during the camp.
Although participants will work through the curriculum in a self-directed manner, we think that learning can be greatly accelerated when there are experts on hand to answer questions and clarify concepts.

We expect mentors to be:
- Experienced ML programmers
- Familiar with the content of the MLAB curriculum (it's helpful, but not necessary, if they have participated in MLAB themselves)
- Knowledgeable about AI safety (although this is less important)
- Comfortable with teaching (past teaching or tutoring experience can be useful)

However, we also acknowledge that being a mentor can be useful for gaining skills and confidence in teaching, and for consolidating the content in one's own mind; we hope that being a mentor will also be a useful experience for mentors themselves!

If needed, we are able to provide accommodation in Cambridge, and can offer compensation for your time at £100 for a half day or £200 for a full day. We understand that m...
