Less Wrong is a community blog devoted to refining the art of human rationality. Please visit our About page for more information.

[Link] "On the Impossibility of Supersized Machines"

13 crmflynn 31 March 2017 11:32PM

The Future of Humanity Institute is hiring a project manager

5 crmflynn 26 January 2017 06:19PM

https://www.fhi.ox.ac.uk/project-manager/

Applications are invited for a full-time Research Project Manager with the Future of Humanity Institute (FHI) at the University of Oxford. The post is fixed-term for 18 months from the date of appointment.

Reporting to the Assistant Director at the Future of Humanity Institute, the successful candidate will be responsible for coordinating, monitoring and developing the activities of the institute. The postholder will be based at FHI, Littlegate House, St Ebbe’s Street, OX1 1PT.

The postholder’s main responsibilities will include: coordinating, monitoring and helping to develop the activities of the FHI’s research programmes and actively seeking funding for the activities of the Institute; organising meetings, workshops, seminars and conferences in collaboration with Professor Nick Bostrom and other researchers; acting as an ambassador for the Institute’s research, both within Oxford and externally; giving regular advice to foundations, philanthropists and policymakers interested in reducing existential risk; and producing reports for government, industry and other relevant organisations.

Applicants will be familiar with existing research and literature in the field and have excellent communication skills, including the ability to write for publication. He or she will have experience of independently managing a research project and of contributing to large policy-relevant reports. Previous professional experience working for non-profit organisations, and a network in the relevant fields associated with existential risk may be an advantage, but are not essential.

Candidates should apply via this link, and must submit a CV and supporting statement as part of their application. The closing date for applications is 12.00 midday on Thursday 2 February 2017. Please contact fhijobs@philosophy.ox.ac.uk with questions about the role, and fhiadmin@philosophy.ox.ac.uk with questions about the application process.


 

[Link] FHI is accepting applications for internships in the area of AI Safety and Reinforcement Learning

5 crmflynn 07 November 2016 04:33PM

[Link] Rebuttal piece by Stuart Russell and FHI Research Associate Allan Dafoe: "Yes, the experts are worried about the existential risk of artificial intelligence."

9 crmflynn 03 November 2016 05:54PM

The University of Cambridge Centre for the Study of Existential Risk (CSER) is hiring!

6 crmflynn 06 October 2016 04:53PM

The University of Cambridge Centre for the Study of Existential Risk (CSER) is recruiting for an Academic Project Manager. This is an opportunity to play a shaping role as CSER builds on its first year's momentum towards becoming a permanent world-class research centre. We seek an ambitious candidate with initiative and a broad intellectual range for a postdoctoral role combining academic and project management responsibilities.

The Academic Project Manager will work with CSER's Executive Director and research team to co-ordinate and develop CSER's projects and overall profile, and to develop new research directions. The post-holder will also build and maintain collaborations with academic centres, industry leaders and policy makers in the UK and worldwide, and will act as an ambassador for the Centre’s research externally. Research topics will include AI safety, bio risk, extreme environmental risk, future technological advances, and cross-cutting work on governance, philosophy and foresight. Candidates will have a PhD in a relevant subject, or have equivalent experience in a relevant setting (e.g. policy, industry, think tank, NGO).

Application deadline: November 11th. http://www.jobs.cam.ac.uk/job/11684/

The Global Catastrophic Risk Institute (GCRI) seeks a media engagement volunteer/intern

5 crmflynn 14 September 2016 04:42PM

Volunteer/Intern Position: Media Engagement on Global Catastrophic Risk

http://gcrinstitute.org/volunteerintern-position-media-engagement-on-global-catastrophic-risk/

The Global Catastrophic Risk Institute (GCRI) seeks a volunteer/intern to contribute on the topic of media engagement on global catastrophic risk, which is the risk of events that could harm or destroy global human civilization. The work would include two parts: (1) analysis of existing media coverage of global catastrophic risk and (2) formulation of strategy for media engagement by GCRI and our colleagues. The intern may also have opportunities to get involved in other aspects of GCRI.

All aspects of global catastrophic risk would be covered. Emphasis would be placed on GCRI’s areas of focus, including nuclear war and artificial intelligence. Additional emphasis could be placed on topics of personal interest to the intern, potentially including (but not limited to) climate change, other global environmental threats, pandemics, biotechnology risks, asteroid collision, etc.

The ideal candidate is a student or early-career professional seeking a career at the intersection of global catastrophic risk and the media. Career directions could include journalism, public relations, advertising, or academic research in related social science disciplines. Candidates seeking other career directions would also be considered, especially if they see value in media experience. However, we have a strong preference for candidates intending a career focused on global catastrophic risk.

The position is unpaid. The intern would receive opportunities for professional development, networking, and publication. GCRI is keen to see the intern benefit professionally from this position and will work with the intern to ensure that this happens. This is not a menial labor activity, but instead is one that offers many opportunities for enrichment.

A commitment of at least 10 hours per month is expected. Preference will be given to candidates able to make a larger time commitment. The position will begin during August-September 2016. The position will run for three months and may be extended pending satisfactory performance.

The position has no geographic constraint. The intern can work from anywhere in the world. GCRI has some preference for candidates from American time zones, but we regularly work with people from around the world. GCRI cannot provide any relocation assistance.

Candidates from underrepresented demographic groups are especially encouraged to apply.

Applications will be considered on an ongoing basis until 30 September 2016.

To apply, please send the following to Robert de Neufville (robert [at] gcrinstitute.org):

* A cover letter introducing yourself and explaining your interest in the position. Please include a description of your intended career direction and how it would benefit from media experience on global catastrophic risk. Please also describe the time commitment you would be able to make.

* A resume or curriculum vitae.

* A writing sample (optional).

The Future of Humanity Institute is hiring!

13 crmflynn 18 August 2016 01:09PM

FHI is accepting applications for a two-year position as a full-time Research Project Manager. Responsibilities will include coordinating, monitoring, and developing FHI’s activities, seeking funding, organizing workshops and conferences, and effectively communicating FHI’s research. The Research Project Manager will also be expected to work in collaboration with Professor Nick Bostrom, and other researchers, to advance their research agendas, and will additionally be expected to produce reports for government, industry, and other relevant organizations.

Applicants will be familiar with existing research and literature in the field and have excellent communication skills, including the ability to write for publication. He or she will have experience of independently managing a research project and of contributing to large policy-relevant reports. Previous professional experience working for non-profit organisations, experience with effective altruism, and a network in the relevant fields associated with existential risk may be an advantage, but are not essential.

To apply please go to https://www.recruit.ox.ac.uk and enter vacancy #124775 (it is also possible to find the job by choosing “Philosophy Faculty” from the department options). The deadline is noon UK time on 29 August. To stay up to date on job opportunities at the Future of Humanity Institute, please sign up for updates on our vacancies newsletter at https://www.fhi.ox.ac.uk/vacancies/.

Comment author: [deleted] 03 May 2016 01:17:41AM 1 point

What sort of experience and education would make a candidate competitive for this position?

Comment author: crmflynn 04 May 2016 05:56:37PM 1 point

My sense from talking with Professor Dafoe is that he is primarily interested in recruiting people based on their general aptitude, interest, and dedication to the issue rather than relying heavily on specific educational credentials.

Comment author: AlexMennen 03 May 2016 06:56:21AM 1 point

Source?

Comment author: crmflynn 04 May 2016 05:52:43PM 1 point

https://www.fhi.ox.ac.uk/vacancies-for-research-assistants/ It was not up on the website at the time you asked, but it is up now.

Paid research assistant position focusing on artificial intelligence and existential risk

7 crmflynn 02 May 2016 06:27PM

Yale Assistant Professor of Political Science Allan Dafoe is seeking Research Assistants for a project on the political dimensions of the existential risks posed by advanced artificial intelligence. The project will involve exploring issues related to grand strategy and international politics, reviewing possibilities for social scientific research in this area, and institution building. Familiarity with international relations, existential risk, Effective Altruism, and/or artificial intelligence are a plus but not necessary. The project is done in collaboration with the Future of Humanity Institute, located in the Faculty of Philosophy at the University of Oxford. There are additional career opportunities in this area, including in the coming academic year and in the future at Yale, Oxford, and elsewhere. If interested in the position, please email allan.dafoe@yale.edu with a copy of your CV, a writing sample, an unofficial copy of your transcript, and a short (200-500 word) statement of interest. Work can be done remotely, though being located in New Haven, CT or Oxford, UK is a plus.
