IASSIST Fellows 2013

 

The IASSIST Fellows Committee is pleased to announce the six recipients of the 2013 IASSIST Fellowship award. We are extremely excited to have such a diverse and interesting group with different backgrounds and experience, and we encourage IASSISTers to welcome them at our conference in Cologne, Germany.

Please find below their names, countries and brief bios:

Chifundo Kanjala (Tanzania) 

Chifundo currently works as a Data Manager and data documentalist for an HIV research group called the ALPHA network, based at the London School of Hygiene and Tropical Medicine's Department of Population Health. Chifundo spends most of his time in Mwanza, Tanzania, but travels from time to time around Southern and Eastern Africa to work with colleagues in the ALPHA network. Before joining the London School of Hygiene and Tropical Medicine, he worked as a data analyst consultant at UNICEF Zimbabwe. He is currently working part time on a PhD with the London School of Hygiene and Tropical Medicine. He has an MPhil in Demography from the University of Cape Town, South Africa, and a BSc Statistics Honours degree from the University of Zimbabwe.


Judit Gárdos (Hungary) 

Judit Gárdos studied Sociology and German Language and Literature in Budapest, Vienna and Berlin. She is a PhD candidate in sociology, working on the philosophy, sociology and anthropology of quantitative sociology, and a young researcher at the Institute of Sociology of the Hungarian Academy of Sciences. Judit has been working at the digital archive and research group "voicesofthe20century.hu", which collects qualitative, interview-based sociological research collections from the last 50 years. She is coordinating the work at the newly funded Research Documentation Center of the Center for Social Sciences at the Hungarian Academy of Sciences.


Cristina Ribeiro (Portugal) 

Cristina Ribeiro is an Assistant Professor in Informatics Engineering at the Universidade do Porto and a researcher at INESC TEC. She graduated in Electrical Engineering, holds a Master's in Electrical and Computer Engineering, and has a Ph.D. in Informatics. Her teaching includes undergraduate and graduate courses in information retrieval, digital libraries, knowledge representation and markup languages. She has been involved in research projects in the areas of cultural heritage, multimedia databases and information retrieval. Currently her main research interests are information retrieval, digital preservation and the management of research data.


Aleksandra Bradić-Martinović (Serbia) 

Aleksandra Bradić-Martinović, PhD, is a Research Fellow at the Institute of Economic Sciences, Belgrade, Serbia. Her field of expertise is research on the implementation of information and communication technology in the economy, especially in banking, payment system operations and stock exchange operations. Aleksandra also teaches at the Belgrade Banking Academy in the following subjects: E-banking and Payment Systems, Stock Market Dealings, and Management Information Systems. She has been engaged in several projects in the field of education, and she is the Serbian team coordinator for the FP7 SERSCIDA project.


Anis Miladi (Tunisia) 

Anis Miladi earned his Bachelor's degree in computer science and multimedia in 2007 and a Master's degree in Management of Information Systems and Organizations in 2008, and he is currently finalizing a Master's degree in project management (projected completion date: summer 2013). Before joining the Social and Economic Survey Research Institute at Qatar University as a Survey Research Technology Specialist in 2009, he worked as a programmer analyst at a private IT services company in Tunisia. His areas of expertise include managing computer-assisted surveys (CAPI and CATI with the Blaise surveying system) as well as Enterprise Document Management Systems and Enterprise Portals (SharePoint).


Lejla Somun-Krupalija (Sarajevo) 

Lejla currently serves as the Senior Program and Research Officer at the Human Rights Centre of the University of Sarajevo. She has over 15 years of experience in research and policy development on social inclusion issues. She is the Project Coordinator of the SERSCIDA FP7 project, which aims to open data services/archives in the Western Balkan region in cooperation with CESSDA members. She was previously engaged in the NGO sector, particularly on issues of capacity building and policy development in the areas of gender equality, the rights of persons with disabilities, and issues of social inclusion and forced migration. She teaches academic writing, qualitative research, and gender and nationalism at the University of Sarajevo.

IASSIST 2013 - Early bird registration rates until April 30!

GESIS – Leibniz Institute for the Social Sciences is proud to host the IASSIST 2013 Conference at Maternushaus in Cologne, Germany from May 28-31.  The IASSIST 2013 theme is Data Innovation: Increasing Accessibility, Visibility and Sustainability.  In line with the theme, the IASSIST Program is streamed into three tracks this year: Research Data Management; Data Developers and Tools; and Data Public Services.  Presentations cover topics such as standards and processes in data management, metadata extensions and tools, data citation practices, sensitive data and much more! To see the full program and to register, visit the website here: http://www.iassist2013.org/iassist-2013-home/.  Early bird registration rates are still available until April 30th.  See you in Cologne! (IASSIST 2013 Program and Local Arrangements Committees)

Newly elected IASSIST officials

Dear IASSISTers,

With a 59% voter turnout, the following people have been elected as our IASSIST Officers, whose terms begin at the end of the Annual Business Meeting of the Association, at lunchtime on Thursday 30 May 2013:

President: Bill Block

Vice President: Tuomas J. Alaterä

Treasurer: Thomas Lindsay

Secretary: Kristin Partlo

African Regional Secretary: Lynn Woolfrey

Asia-Pacific Regional Secretary: Sam Spencer

Canadian Regional Secretary: Michelle Edwards

European Regional Secretary: Tanvi Desai

US Regional Secretary: San Cannon

Admin committee member-at-large, Canada: Maxine Tedesco

Admin committee member-at-large, Europe: Laurence Horton

Admin committee members-at-large, USA: Amy Pienta, Lynda Kellam and Harrison Dekker

Congratulations to our newly elected officials, and I hope more people are encouraged to come forward and stand for positions in the next IASSIST election, which will be held in March 2015!

Melanie Wright

IASSIST Past President and Elections Chair

IASSISTers and librarians are doin' it for themselves


Hey IASSISTers (gents, pardon the video pun - couldn't resist),

Are librarians at your institutions struggling to get up to speed with research data management (RDM)? If they're not, they probably should be. Library organisations are publishing reports and issuing recommendations left and right, such as the LIBER (Association of European Research Libraries) 2012 report, "Ten Recommendations for Libraries to Get Started with Research Data Management" (PDF). Just last week Nature published an article highlighting what the Great and the Good are doing in this area: Publishing Frontiers: The Library Reboot.

So the next question is, as a data professional, what are you doing to help the librarians at your institution get up to speed with RDM? Imagine (it isn't that hard for some of us) having gotten your library master's degree sometime in the last century and now being told your job includes helping researchers manage their data. Librarians are sturdy souls, but that notion could be a bitter pill for someone who went into librarianship because of their love of books, right?

So you are a local expert who can help them. No doubt there will be plenty of opportunities for them to return the favour.

If you don't consider yourself a trainer, that's okay. Tell them about the Do-It-Yourself Research Data Management Training Kit for Librarians, from EDINA and Data Library, University of Edinburgh. They can train themselves in small groups, making use of reading assignments in MANTRA, reflective writing questions, group exercises from the UK Data Archive, and plenty of discussion time, to draw on their existing rich professional experience.

And then you can step in as a local expert to give one or more of the short talks to lead off the two-hour training sessions in your choice of five RDM topics. Or if you're really keen, you can offer to be a facilitator for the training as a whole. Either way it's a great chance to build relationships across the institution, review your own knowledge, and raise your local visibility. If you're with me so far, read on for the promotional message about the training kit.

DIY Research Data Management Training Kit for Librarians

EDINA and Data Library, University of Edinburgh is pleased to announce the public release of the Do-It-Yourself Research Data Management Training Kit for Librarians, under a CC-BY licence:

http://datalib.edina.ac.uk/mantra/libtraining.html.

 The training kit is designed to contain everything needed for librarians in small groups to get themselves up to speed on five key topics in research data management - with or without expert speakers.

 The kit is a package of materials used by the Data Library in facilitating RDM training with a small group of librarians at the University of Edinburgh over the winter of 2012-13. The aim was to reuse the MANTRA course developed by the Data Library for early career researchers in a blended learning approach for academic liaison librarians.

 The training comprises five 2-hour face-to-face sessions. These open with short talks followed by group exercises from the UK Data Archive and long discussions, in a private collegiate setting. Emphasis is placed on facilitation and individual learning rather than long lectures and passive listening. MANTRA modules are used as reading assignments and reflective writing questions are designed to help librarians 'put themselves in the shoes of the researcher'. Learning is reinforced and put into practice through an independent study assignment of completing and publishing an interview with a researcher using the Data Curation Profile framework developed by D2C2 at Purdue University Libraries.

 The kit includes:

 * Promotional slides for the RDM Training Kit

* Training schedule

* Research Data MANTRA online course by EDINA and Data Library, University of Edinburgh: http://datalib.edina.ac.uk/mantra

* Reflective writing questions

* Selected group exercises (with answers) from the UK Data Archive, University of Essex - Managing and Sharing Data: Training Resources, September 2011 (PDF). Complete RDM Resources Training Pack available: http://data-archive.ac.uk/create-manage/training-resources

* Podcasts (narrated presentations) for short talks by the original Edinburgh speakers (including from the DCC), for use if running the course without ‘live’ speakers.

* Presentation files - if learners decide to take turns presenting each topic.

* Evaluation forms

* Independent study assignment: Data Curation Profile, from D2C2, Purdue University Libraries. Resources available: http://datacurationprofiles.org/

 As data librarians, we are aware of a great deal of curiosity and in some cases angst on the part of academic librarians regarding research data management. The training kit makes no assumptions about the role of librarians in supporting research data management, but aims to empower librarians to support each other in gaining confidence in this area of research support, whether or not they face the prospect of a new remit in their day to day job. It is aimed at practicing librarians who have much personal and professional experience to contribute to the learning experience of the group.

Become rich and famous: publish in the IQ!

Many IASSIST members have recently received acceptances for their papers for the upcoming IASSIST 2013 conference in Cologne. There will be many interesting presentations at the conference. The conference presentation is your chance to present a project you are involved in, to air your arguments in your special areas, and in general to add to the IASSIST knowledge bank.

Projects typically focus on supporting social science research, but IASSIST-related support now takes many forms as technologies and applications develop. Presenting at the conference will bring discussion of, and improvements to, your work. After the conference you can reach a greater audience by publishing a revised paper in a coming issue of the IQ. Articles for the IASSIST Quarterly are always very welcome. They can be papers from IASSIST conferences or other conferences and workshops, from local presentations, or papers written especially for the IQ.

If you are chairing a conference session you have the opportunity to become guest editor and to aggregate and integrate papers on a common subject for a special issue of the IQ.

Authors are very welcome to take a look at the instructions and article template on the IASSIST website. Authors and guest editors can also contact the editor via e-mail: kbr@sam.sdu.dk.

Karsten Boye Rasmussen - March 2013

Introducing the IASSIST Data Visualization Interest Group (DVIG!)

Hello fellow IASSISTers,

With the upcoming 2013 conference nearing, we thought it very fitting to introduce you all to the newly created IASSIST Data Visualization Interest Group. Formed over the winter and spring of 2013, this group brings together over 46 IASSIST members from across the world (literally across the world! Check out the map of our locations), all interested in data visualization. We hope to share a range of skills and information around tools and best-practice visualization, and to discuss innovative representations of data, statistics, and information. Here is just a glimpse of our group's tools exposure.

As research becomes more interdisciplinary and data and information are more readily used and reused, core literacies surrounding the use and understandability of data are required. Data Visualization offers a means to make sense of data through visual representation and to communicate ideas and information effectively. It is quickly becoming a well-developed field, not only in terms of technology (the development of tools for analyzing and visualizing data) but also as an established field of study and research discipline. As data and information professionals, we are required to stay abreast of the latest technologies, disciplines, methods and techniques used for research in this data-intensive and changing research landscape. Data Visualization, with its many branches and techniques, seeks to present data, information, and statistics in new ways, ways that our researchers are harnessing with high-powered computers (and sometimes not so high-powered ones) to perform analysis of data. From conventional ways to visualize and graph data, such as tables, histograms, pie charts, and bar and line graphs, to the often more complex network relationship models and diagrams, cluster and burst analysis, and text analysis charts, we see data visualization techniques at play more than ever.
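
To make the contrast between conventional charts and newer techniques concrete, here is a minimal sketch in Python (using matplotlib and networkx purely as examples; the group does not endorse any particular tool, and all data below are invented for illustration) that places a conventional histogram next to a simple network relationship diagram.

# A minimal sketch, not a recommendation: one conventional chart and one
# network diagram side by side. Assumes matplotlib and networkx are installed;
# all data are made up for illustration.
import random
import matplotlib.pyplot as plt
import networkx as nx

random.seed(42)
scores = [random.gauss(50, 10) for _ in range(500)]  # hypothetical survey variable

fig, (ax_hist, ax_net) = plt.subplots(1, 2, figsize=(10, 4))

# Conventional visualization: a histogram of a single variable
ax_hist.hist(scores, bins=20, color="steelblue", edgecolor="white")
ax_hist.set_title("Conventional: histogram")
ax_hist.set_xlabel("Score")
ax_hist.set_ylabel("Frequency")

# A newer technique: a network relationship diagram (a random graph stands in
# for, say, a co-authorship or citation network)
graph = nx.erdos_renyi_graph(n=30, p=0.1, seed=42)
nx.draw_networkx(graph, ax=ax_net, node_size=80, with_labels=False)
ax_net.set_title("Network relationship diagram")
ax_net.axis("off")

plt.tight_layout()
plt.savefig("dataviz_examples.png", dpi=150)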

This group has set a core mission and charge to focus on promoting a greater understanding of data visualization – its creation, uses, and importance in research across disciplines. Particular areas of focus include, but are not limited to, the following:

  • Enable opportunities for IASSIST members to learn and enhance their skills in this growing field;
  • Support a culture of best practice for data visualization techniques: creation, use, and curation;
  • Discuss the relevant tools (programs, web tools, and software) for all kinds of data visualizations (spatial, temporal, categorical, multivariate, graphing, networks, animation, etc.);
  • Provide input and feedback on data visualization tools;
  • Capture examples of data visualization to emulate and to avoid;
  • Explore opportunities for service development in libraries;
  • Be aware of, and communicate to others, the needs of researchers in this field;
  • Use data visualization to allow pre-analysis browsing of data content in repositories;
  • Connect with communities of metadata developers and users (e.g., the DDI Alliance) to gain a better understanding of how metadata can enable better visualization, and how in turn visualization needs might drive the development of metadata standards;
  • And more!

Please join me in welcoming this new interest group; we hope to share and learn from you all at the upcoming conference! We are always seeking input and ideas to share, so please get in touch with us at iassist-dataviz@lists.carleton.edu (either I or another member can add you to the group).

All the best, and Happy Easter!

Amber Leahey

Update from COSSA: Changes to the Common Rule: The Implications for the Social and Behavioral Sciences

This is from the COSSA Newsletter (Consortium of Social Science Associations), March 25, 2013, Volume 32, Issue 6, regarding a workshop on proposed changes to the Common Rule. Readers of these blog entries will recall that these proposed changes would require that data identified in social science research meet HIPAA standards, potentially rendering many public datasets unusable for research purposes.

A link to the webcast is here: http://sites.nationalacademies.org/DBASSE/BBCSS/CurrentProjects/DBASSE_080452#Workshop  George Alter spoke on the panel on Data Security and Sharing.

Here is a summary of the COSSA report:

On March 21 and 22, the National Academies' Board on Behavioral, Cognitive, and Sensory Sciences (BBCSS) held a workshop on the "Proposed Revisions to the Common Rule in Relation to the Behavioral and Social Sciences." In 2011, the Department of Health and Human Services proposed changes to the Common Rule, the regulations governing the protection of human subjects in research, in an Advanced Notice of Proposed Rulemaking (ANPRM). (For more information, see Update, January 28, 2013, and click here for a response to the ANPRM from the social and behavioral science community.) Several COSSA member organizations helped sponsor the workshop. More information about the workshop, including presenters' slides and an archived webcast, is available here. BBCSS will publish a summary report of the workshop. According to Robert Hauser, Executive Director of the Division of Behavioral and Social Sciences and Education (DBASSE), the Academies expect to convene a panel that will produce a consensus report with conclusions and recommendations.

 

The workshop's opening session reviewed existing knowledge and evidence about the functioning of the Common Rule and Institutional Review Boards (IRBs). Connie Citro, Director of the Committee on National Statistics at the National Academies, gave an overview of the many National Academies' reports on human subjects protection published since 1979 and summarized the lessons learned. She pointed to four major takeaways from the existing literature. First, one-size-fits-all approaches often have unanticipated negative consequences. Second, there is no need to reinvent the wheel regarding human subjects' protection. Third, a balance needs to be struck between leaving subjects vulnerable and handicapping researchers. Finally, the social and behavioral sciences (SBS) are often not given the same consideration as the biomedical sciences in writing regulations and thus need to be constantly vigilant to make sure that new rules are appropriate for a SBS context.

 

Noting that there is a relatively small evidence base on the efficacy of the Common Rule and IRBs, Jeffrey Rodamar, Department of Education, reviewed some of the existing data. He found that despite popular perception, IRBs function pretty well. They are generally no more of an administrative burden than other grant-related activities; on average, review takes less than three percent of a study's time; a majority of studies are approved; expedited review takes less than a month on average and full review takes less than two months; and extreme delays are statistically uncommon. Rodamar described data showing that both SBS and biomedical researchers generally approve of the IRB system. He conceded that there are some problems with the Common Rule regulations and IRBs, but, paraphrasing Winston Churchill, suggested that perhaps "IRBs are the worst form of governing research except for all those other forms that have been tried from time to time."

 

The "Minimal Risk" Standard

 

The second session, moderated by Celia Fischer, Fordham University, focused on the types of "risks and harms" encountered in SBS research. Richard T. Campbell delved into the concept of "minimal risk," an important area for researchers dealing with human subjects. The determination of whether participation in a study represents a "minimal risk" dictates the level of IRB review that takes place. Under the Common Rule, a study represents minimal risk if "the probability and magnitude of harm or discomfort anticipated in the research are not greater in and of themselves than those ordinarily encountered in daily life or during the performance of routine physical or psychological examinations or tests." Noting that it is a "cognitively complex" concept, Campbell suggested that risk can be thought of as the relationship between the probability of harm occurring and the severity of potential harm. Thus, the Common Rule provides some flexibility in that it does not dictate that both probability and severity must be "minimal," just that, as probability increases, the severity of possible harm must decrease (and vice versa). Given that other parts of the definition are also thorny (such as what is meant by "daily life"), Campbell suggested that the Office for Human Research Protections (OHRP) could provide guidance to facilitate more consistent application of the minimal risk benchmark.

 

Brian Mustanski, Northwestern University Feinberg School of Medicine, spoke about his research on risky and sensitive behavior (such as drug use, sexual behavior, and HIV) in youth, which are topics that often make IRBs skittish. He conducted a study that was reviewed by two IRBs. One board approved it immediately, while the other delayed the study for six months because it was felt to be a "slight increase" over minimal risk. However, when Mustanski surveyed his subjects, a large majority felt that their participation was less uncomfortable than a routine medical exam (the minimal risk standard). Mustanski argued that such institutional reluctance to approve research into controversial or sensitive subjects as minimal risk can have a chilling effect, leading to a poor evidence base for interventions with already underserved populations, which is indeed the case regarding HIV prevention in LGBT youth.

 

Steve Breckler, American Psychological Association and COSSA Board Member, discussed the concept of risk in the SBS context. He reminded the audience that the broad goal of assessing risk is to calibrate the level of review to the level of risk a study poses to participants, in other words, to protect subjects and reduce unnecessary regulatory burden. He argued that the social science community should put greater focus on producing evidence to determine how well regulations are working and that having better guidance and tools for assessing risk would facilitate the work of IRBs.

 

Charles Plott, California Institute of Technology and a former COSSA Board member, posed the question of whether the entirety of the research endeavors for some fields, like economics, political science, game theory and decision science, could be said to be wholly without risk. In a survey of economics, political science, and judgment and decision making associations, Plott found very low numbers of adverse incidents and reports of harm, all of which were low-magnitude events (such as feelings of stress or frustration). He argued that some research topics (markets, committees and voting, games, processes, and decisions) and some research methods (questionnaires, computer games, etc.) can be said to pose no potential harm to subjects and should thus be exempted from consideration under the Common Rule.

 

Informed Consent and Special Populations

 

A session on the consent process and special populations was moderated by Margaret Foster Riley, University of Virginia. Sally Powers, University of Massachusetts, Amherst, discussed how consent operates in her research on depression, which collects "rich" behavioral and biospecimen data (which can be recoded and analyzed as part of future analysis). The proposed changes to the Common Rule would require that prior consent be obtained for re-analysis of biospecimens, but would allow that consent to be given for open-ended use of specimens. However, the changes do not address rich behavioral data; Powers argued that the same standards should be applied.

 

Roxane Cohen Silver explained how she conducts research on victims of disasters and traumatic experiences (like natural disasters, infant death, and mass shootings) shortly after such events occur. Silver argued that such research can be conducted ethically and sensitively if participants are given multiple opportunities to opt out, are allowed to refuse to answer questions, and researchers and staff are well trained. Noting that this type of research is most valuable if it is commenced immediately after a traumatic event, Silver described her arrangement with her IRB, which pre-approved a generic post-disaster proposal. In the aftermath of a traumatic event, Silver provides the IRB with specific information and can get full approval within 48 hours.

 

Celia Fischer, Fordham University, spoke about some of the issues involved in obtaining informed consent from children. She argued that simplifying consent forms, as proposed by the ANPRM, would be useful. However, relying on standardized forms can be problematic for certain types of research and for subjects of different ages, language skills, and educational backgrounds. Fischer observed that verbal consent can be a better form in certain contexts. She also noted that emancipated minors are often not treated as full adults by IRBs, despite being adults under the law. Fischer also pointed out the issue of re-obtaining consent from adults for whom parental consent had been granted when they were minors.

 

Data Security and Sharing

 

David Weir, University of Michigan Survey Research Center, moderated a panel on "Data Use and Sharing and Technological Advancement." The proposals in the ANPRM would mandate that all studies that collect identifiable or potentially identifiable data have data security plans. George Alter, of the University of Michigan's Inter-university Consortium for Political and Social Research (ICPSR), which archives and protects social science data, spoke about some of the ways data can be kept secure. Informational risk can be reduced by improving study design (implementing certain sampling procedures, using multiple sites), having protection plans in place, using data repositories and archives, and training. ICPSR restricts data based on the degree of risk of disclosure and the severity of harm from that disclosure, ranging from publicly releasing data online to requiring researchers to work with data in physical data enclaves.

 

Taylor Martin, University of Utah, spoke about the data security implications of her research into math learning, which collects rich data from children playing online educational games. This type of research shows promise in terms of providing new information about how different kinds of children learn and how we can teach them better. However, concerns about data security can have a chilling effect on data sharing and reuse among researchers. Martin observed that for-profit companies are collecting data and doing the same kind of research without having to go through the same hurdles as researchers.

 

Susan Bouregy, Yale University Human Research Protection Program, raised concerns about the ANPRM's proposal to apply HIPAA standards for de-identification of data (requiring removal of 18 specific identifiers). Bouregy noted such standards may make some data sets unusable while ignoring other ways individuals could be identified. She also argued that some of the mandated HIPAA security elements are not appropriate for certain types of social science research, and that the proposal ignores the fact that not all identified data are risky. Finally, Bouregy suggested that the ANPRM's requirement that all suspected data breaches be reported should be made more flexible, allowing IRBs to tailor reporting to the context of each situation.

 

Multi-Disciplinary and Multi-Site Studies

 

Robert Levine, Yale University, moderated a session focused on multi-disciplinary and multi-site studies. Pearl O'Rourke, Partners Health Care System, discussed the requirement that multi-site studies use a single IRB of record. She noted that having a central IRB does not absolve the individual institutions of fulfilling a number of responsibilities in overseeing and approving research. O'Rourke was concerned that mandating a central IRB would not address the complexity of each situation. Furthermore, the requirement underestimates the costs and time involved in running a central IRB.

 

Laura Stark, Vanderbilt University Center for Medicine Health and Society, gave an ethnographic perspective on IRB decision-making. As an explanation for why IRBs reach different conclusions regarding the risk level of similar research, Stark suggested the concept of "local precedents," or allowing past decisions to govern the evaluation of subsequent research. Such precedents may lead to faster decisions and internal consistency, but they can be problematic for researchers working with multiple IRBs. Stark offered three strategies to work around local precedents: 1) study networks (having a central IRB for multiple sites), 2) collegial review (allowing departmental experts to review research), and 3) decision repositories (online archives of approved protocols from many IRBs).

 

Thomas Coates, University of California, Los Angeles Program in Global Health, shared his experience with multinational studies (which are not addressed by the ANPRM). Some concerns he encountered included whether requiring other countries to adhere to U.S. requirements could be considered paternalistic, how to evaluate minimum risk in different cultural and economic contexts, and how to harmonize U.S., international, and local regulations. Coates also stressed the importance of receiving approval from local bodies in addition to U.S.-based IRBs.

 

The Scope of Institutional Review Boards

 

A final session, moderated by Yonette Thomas, Howard University and a COSSA board member, focused on the "Purview and Roles of IRBs." Lois Brako, University of Michigan, discussed the ANPRM's proposed changes from the perspective of an IRB that has made strides to become more innovative and flexible. Brako praised the ANPRM's proposals to reduce the oversight burden for minimal risk studies, eliminate annual review, and harmonize federal regulations (so long as the harmonization does not take the form of a unilateral one-size-fits-all approach). However, she argued that some of the proposals are unnecessarily burdensome, including requiring all institutions that receive Common Rule funding to be subject to federal oversight, some of the information security provisions, requiring reports of all adverse events to be submitted and stored in a central database, and expanding "human subjects" to include deidentified biospecimens. Brako also suggested that in some cases, clearer guidance from OHRP would be more helpful than changed regulations.

 

Rena Lederman, Princeton University, observed that the Common Rule regulations were written from a biomedical perspective and are particularly unsuited for certain types of SBS research, such as anthropological fieldwork. Anthropologists establish thick relationships with their subjects, immerse themselves in other cultures, and do not test hypotheses or run controlled experiments. The ANPRM's requirements for informational security could cripple anthropological research (anthropologists' detailed field notes would be treated as data with informational risks under the new rules, raising the question of how such notes could be deidentified). Rather than trying to adapt the Common Rule to fit SBS research, Lederman proposed that it be applied only to biomedical research. She proposed the creation of a National Commission to develop alternative guidance and a framework to address SBS research risks.

 

Cheryl Crawford Watson, National Institute of Justice (NIJ), discussed the Department of Justice's (DOJ) approach to confidentiality and how it differs from other regulations regarding human subjects protection. Researchers funded by DOJ must submit a Privacy Certificate, which protects researchers and data from subpoena. It also prevents the researcher from violating subjects' privacy for any reason other than future criminal conduct. The DOJ privacy certificate differs from the certificate of confidentiality mandated by other agencies (like Health and Human Services) in that it prohibits researchers from reporting child abuse, reportable communicable diseases, and threatened harm to self or others. In order to be allowed to report such abuse, researchers must get the subjects to sign a separate consent-to-report form. The certificate is so strict due to concerns that few of the subjects under DOJ's purview would consent to participate in research otherwise.



IASSIST 2013 Fellows update

This year the IASSIST Fellows Committee received a grand total of 44 Fellows applications from a strong range of candidates from 28 countries around the globe: 
  • 18 Asia    

  • 13 Africa

  • 7 Europe

  • 3 North America

  • 2 Latin America

  • 1 Australia

Applications have been evaluated by the IASSIST Fellows Committee and offers have been made to a number of prospective Fellows to attend the annual conference in Cologne, Germany. We shall announce the names of those who have accepted the Fellows awards shortly.

We look forward to welcoming the new members at what will no doubt be the best IASSIST ever!

Best Wishes

Co-Chairs of the Fellows Committee

Some reflections on research data confidentiality, privacy, and curation by Limor Peer


Maintaining research subjects’ confidentiality is an essential feature of the scientific research enterprise. It also presents special challenges to the data curation process. Does the effort to open access to research data complicate these challenges?

A few reasons why I think it does: more data are discoverable and could be used to re-identify previously de-identified datasets; systems are increasingly interoperable, potentially bridging what may have been insular academic data with other data and information sources; growing pressure to open data may weaken some of the safeguards previously put in place; and some data are inherently identifiable.
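
To make the re-identification concern concrete, here is a minimal, purely hypothetical sketch in Python: the records, names and column labels are all invented, and the point is only that a handful of quasi-identifiers (ZIP code, birth year, gender) can link a "de-identified" research file back to named individuals once other data become discoverable.

# Hypothetical illustration of a linkage attack; every record below is invented.
deidentified_survey = [
    {"zip": "06511", "birth_year": 1975, "gender": "F", "sensitive_answer": "yes"},
    {"zip": "06511", "birth_year": 1982, "gender": "M", "sensitive_answer": "no"},
]

public_directory = [  # e.g., a discoverable voter roll or alumni list
    {"name": "A. Smith", "zip": "06511", "birth_year": 1975, "gender": "F"},
    {"name": "B. Jones", "zip": "06511", "birth_year": 1982, "gender": "M"},
]

QUASI_IDENTIFIERS = ("zip", "birth_year", "gender")

def linkage_key(record):
    # Build a join key from the quasi-identifiers shared by both sources.
    return tuple(record[q] for q in QUASI_IDENTIFIERS)

directory_index = {linkage_key(person): person["name"] for person in public_directory}

for row in deidentified_survey:
    name = directory_index.get(linkage_key(row))
    if name is not None:
        print(f"Re-identified {name}: sensitive_answer={row['sensitive_answer']}")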

But these challenges should not diminish the scientific community's firm commitment to both principles. It is possible, and desirable, for openness and privacy to co-exist. It will not be simple to do, and here's what we need to keep in mind:

First, let’s be clear about semantics. Open data and public data are not the same thing. As Melanie Chernoff observed, “All open data is publicly available. But not all publicly available data is open.” This distinction is important because what our community means by open (standards, format) may not be what policy-makers and the public at large mean (public access). Chernoff rightly points out that “whether data should be made publicly available is where privacy concerns come into play. Once it has been determined that government data should be made public, then it should be done so in an open format.” So, yes, we want as much data as possible to be public, but we most definitely want data to be open.

Another term that could be clarified is usefulness. In the academic context, we often think of data re-use by other scholars, in the service of advancing science. But what if the individuals from whom the data were collected are the ones who want to make use of it? It’s entirely conceivable that the people formerly known as “research subjects” begin demanding access to, and control over, their own personal data as they become more accustomed to that in other contexts. This will require some fresh ideas about regulation and some rethinking of the concept of informed consent (see, for example, the work of John Wilbanks, NIH, and the National Cancer Institute on this front). The academic community is going to have to confront this issue.

Precisely because terms are confusing and often vaguely defined, we should use them carefully. It’s tempting to pit one term against the other, e.g., usefulness vs. privacy, but it may not be productive. The tension between privacy and openness or transparency does not mean that we have to choose one over the other. As Felix Wu says, “there is nothing inherently contradictory about hiding one piece of information while revealing another, so long as the information we want to hide is different from the information we want to disclose.” The complex reality is that we have to weigh them carefully and make context-based decisions.

I think the IASSIST community is in a position to lead on this front, as it is intimately familiar with issues of disclosure risk. Just last spring, the 2012 IASSIST conference included a panel on confidentiality, privacy and security. IASSIST has a special interest group on Human Subjects Review Committees and Privacy and Confidentiality in Research. Various IASSIST members have been involved with heroic efforts to create solutions (e.g., via the DDI Alliance, UKDA and ICPSR protocols) and educate about the issue (e.g., ICPSR webinar, ICPSR summer course, and MANTRA module). A recent panel at the International Data Curation Conference in Amsterdam showcased IASSIST members' strategies for dealing with this issue (see my reflections about the panel).

It might be the case that STEM is leading the push for open data, but these disciplines are increasingly confronted with problems of re-identification, while the private sector is increasingly being scrutinized for its practices (see this on "data hops"). The social (and, of course, medical) sciences have a well-developed regulatory framework around the issue of research ethics that many of us have been steeped in. Government agencies have their own approaches and standards (see a recent report by the U.S. Government Accountability Office). IASSIST can provide a bridge; we have the opportunity to help define the conversation and offer some solutions.

In search of: Best practice for code repositories?

I was asked by a colleague about organized efforts within the economics community to develop or support repositories of code for research.  Her experience was with the astrophysics world, which apparently has several, and she was wondering what could be learned from another academic community.  So I asked a non-random sample of technical economists with whom I work, and then expanded the question to cover all of the social sciences and posed it to the IASSIST community.

In a nutshell, the answer seems to be “nope, nothing organized across the profession” – even with the profession very broadly defined.  The general consensus for both the economics world and the more general social science community was that there was some chaos mixed with a little schizophrenia. I was told there are instances of such repositories, but they were described to me as “isolated attempts”, such as this one by Volker Wieland:  http://www.macromodelbase.com/.  Some folks mentioned repositories that were package or language based, such as R modules or SAS code from the SAS-L list or online at sascommunity.org.

Many people pointed out that there are more repositories being associated with journals, so that authors can (or are required to) submit their data and code when submitting a paper for publication. Several responses touched on the issue of replication, which is the impetus for most journal requirements, including one that pointed out a “replication archive” at Yale (http://isps.yale.edu/research/data).  I was also pointed to an interesting paper that questions whether such archives promote replicable research (http://www.pages.drexel.edu/~bdm25/cje.pdf), but that’s a discussion for another post.

By far, the most common reference I received was for the repositories associated with RePEc (Research Papers in Economics) which offers a broad range of services to the economic research community.  There you’ll find the IDEAS site (http://ideas.repec.org/) and the QM&RBC site with code for Dynamic General Equilibrium models (http://dge.repec.org/) both run by the St. Louis Fed.

I also heard from support folks who had tried to build a code repository for their departments and were disappointed by the lack of enthusiasm for the project. The general consensus is that economists would love to leverage other people’s code but don’t want to give away their proprietary models.  They should know there is no such thing as a free lunch! 

I did hear that project-specific repositories were found to be useful, but I think of those as collaboration tools rather than a dissemination platform.  That said, one economist did end his email to me with the following plea:  “lots of authors provide code on their websites, but there is no authoritative host. Will you start one please?”

/san/
