Abstract:
Digital media consumes a large and growing share of our waking lives, but these goods and services go largely uncounted in GDP. That's because the measure is based on what people pay for goods and services. If something has a price of zero, then it typically contributes zero to GDP.
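The zero-price mechanism can be made concrete with a toy sketch (hypothetical goods and figures, not an official GDP methodology): expenditure-based GDP sums price times quantity, so a free service contributes nothing no matter how heavily it is used.

```python
# Toy illustration only: GDP-style expenditure accounting sums
# price * quantity, so zero-price digital goods vanish from the total.
goods = [
    {"name": "smartphone",    "price": 500.0, "quantity": 2},
    {"name": "streaming sub", "price": 120.0, "quantity": 1},
    {"name": "search engine", "price": 0.0,   "quantity": 10_000},  # free service
    {"name": "social media",  "price": 0.0,   "quantity": 5_000},   # free service
]

gdp_contribution = sum(g["price"] * g["quantity"] for g in goods)
print(gdp_contribution)  # 1120.0 -- the free services contribute nothing
```

However large the "quantity" of free usage grows, the measured total is unchanged, which is exactly the measurement gap the abstract describes.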
Abstract:
In the mid-twentieth century, businesses around the world began to see technical know-how as one of the most important assets they could possess. While their exact definitions of know-how varied (usually centering on employees' tacit knowledge; accumulated, minor innovations rather than just patentable inventions; and tailoring to local conditions), the rapidly growing perception that it was invaluable led to widespread know-how licensing. As businesses embraced it, legal scholars and business lawyers scrambled during the 1950s through the 1970s to clarify the legal bases for intellectual property protections for know-how. In the 1970s, Supreme Court decisions undermined this effort, and a consortium of legal organizations turned instead to lobbying for statutory protection for the related, narrower category of "trade secrets." Despite the rise and relative decline of know-how in American business and law, interest in the term spread to other languages and legal systems, and the repercussions of these shifting understandings of technology transfer remain with us today.
Abstract:
The miniaturization of semiconductor transistors has driven the growth in computer performance for more than 50 years. As miniaturization approaches its limits, bringing an end to Moore's law, performance gains will need to come from software, algorithms, and hardware. We refer to these technologies as the "Top" of the computing stack to distinguish them from the traditional technologies at the "Bottom": semiconductor physics and silicon-fabrication technology. In the post-Moore era, the Top will provide substantial performance gains, but these gains will be opportunistic, uneven, and sporadic, and they will suffer from the law of diminishing returns. Big system components offer a promising context for tackling the challenges of working at the Top.
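A minimal sketch of what a gain at the "Top" looks like (illustrative workload, not from the paper): the same question answered by a quadratic algorithm and by a linear one, where the speedup comes from a better algorithm rather than faster transistors.

```python
# Illustrative only: two algorithms for the same task, showing where
# post-Moore performance gains at the "Top" can come from.

def has_duplicate_quadratic(xs):
    """O(n^2): compare every pair of elements."""
    for i in range(len(xs)):
        for j in range(i + 1, len(xs)):
            if xs[i] == xs[j]:
                return True
    return False

def has_duplicate_linear(xs):
    """O(n): track previously seen values in a hash set."""
    seen = set()
    for x in xs:
        if x in seen:
            return True
        seen.add(x)
    return False

data = list(range(10_000)) + [42]  # one duplicate at the end
assert has_duplicate_quadratic(data) == has_duplicate_linear(data) == True
```

As the abstract notes, such gains are real but uneven: not every workload admits an asymptotically better algorithm, and repeated optimization of the same code faces diminishing returns.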
Abstract:
Data-dependent acquisition (DDA) and data-independent acquisition (DIA) strategies have both improved our understanding of proteomics samples. The advantages and disadvantages of each are well documented: DDA is typically applied for deep discovery, while DIA may be used to create comprehensive sample records. In this paper, we present a hybrid data acquisition and processing strategy (pSMART) that combines the strengths of both techniques and provides significant benefits for qualitative and quantitative peptide analysis. The performance of pSMART is compared to published DIA strategies in an experiment that allows objective assessment of DIA performance with respect to the interrogation of previously acquired MS data. The results of this experiment demonstrate that pSMART creates fewer decoy hits than a standard DIA strategy. Moreover, we show that pSMART is more selective, sensitive, and reproducible than either standard DIA or DDA strategies alone.
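The decoy-hit comparison rests on the standard target-decoy approach used across proteomics search pipelines; a minimal sketch of the usual global false-discovery-rate estimate follows (the scores below are made up, and pSMART's own scoring is not shown).

```python
# Sketch of the standard target-decoy FDR estimate commonly used to
# compare acquisition strategies; not the paper's specific pipeline.

def estimate_fdr(hits):
    """hits: list of (score, is_decoy) for all accepted identifications.
    Returns the simple global estimate FDR = decoys / targets."""
    decoys = sum(1 for _, is_decoy in hits if is_decoy)
    targets = len(hits) - decoys
    return decoys / targets if targets else float("inf")

# Hypothetical accepted hits: one decoy among four target matches.
hits = [(9.1, False), (8.7, False), (7.2, True), (6.9, False), (5.0, False)]
print(round(estimate_fdr(hits), 2))  # 0.25
```

Under this estimate, a strategy that accepts fewer decoy hits at the same number of target hits reports a lower FDR, which is the sense in which fewer decoy hits indicates better selectivity.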
Abstract:
Massive open online courses (MOOCs) are often characterized as remedies to educational disparities related to social class. Using data from 68 MOOCs offered by Harvard and MIT between 2012 and 2014, we found that course participants from the United States tended to live in more-affluent and better-educated neighborhoods than the average U.S. resident. Among those who registered for courses, students with greater socioeconomic resources were more likely to earn a certificate. Furthermore, these differences in MOOC access and completion were larger for adolescents and young adults, the ages at which people traditionally find on-ramps into science, technology, engineering, and mathematics (STEM) course work and careers. Our findings raise concerns that MOOCs and similar approaches to online learning can exacerbate rather than reduce disparities in educational outcomes related to socioeconomic status.
Abstract:
This paper considers compensation of anticipated erasures in a discrete-time (DT) signal such that the desired interpolation can still be accomplished, with minimum error, through a linear time-invariant (LTI) filter. The algorithms presented may potentially be useful in the compensation of a fault in a digital-to-analog converter where samples are dropped at known locations prior to reconstruction. Four algorithms are developed. The first is a general solution that, in the presence of erasures, minimizes the squared error for arbitrary LTI interpolation filters. In certain cases, e.g., oversampling and a sinc-interpolating filter, this solution is specialized so it perfectly compensates for erasures. The second solution is an approximation to the general solution that computes the optimal, finite-length compensation for arbitrary LTI interpolation filters. The third is a finite-length windowed version of the oversampled, sinc-interpolating solution using discrete prolate spheroidal sequences. The last is an iterative algorithm in the class of projection onto convex sets. Analysis and results from numerical simulations are presented.
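A minimal numerical sketch of the idea behind the first (general least-squares) algorithm follows; the filter, signal, and erasure locations here are illustrative assumptions, not the paper's setup. With the erasure positions known in advance, the surviving neighboring samples can be adjusted by linear least squares so that the output of a fixed LTI interpolation filter best matches the erasure-free output.

```python
import numpy as np

# Illustrative sketch only: least-squares compensation of known erasures
# for a fixed LTI interpolation filter (filter and signal are arbitrary).

rng = np.random.default_rng(0)
N = 64
x = rng.standard_normal(N)            # original discrete-time signal
h = np.hamming(9)
h /= h.sum()                          # a fixed LTI interpolation filter

# Express convolution as a matrix so compensation is linear least squares.
H = np.zeros((N + len(h) - 1, N))
for j in range(N):
    H[j:j + len(h), j] = h

y_desired = H @ x                     # interpolator output with no erasures

erased = [20, 21, 40]                 # known erasure locations
x0 = x.copy()
x0[erased] = 0.0                      # samples dropped before reconstruction

# Perturb a window of surviving neighbors around each erasure.
comp = sorted({i for e in erased for i in range(e - 3, e + 4)
               if 0 <= i < N and i not in erased})

A = H[:, comp]                        # effect of the compensation samples
d, *_ = np.linalg.lstsq(A, y_desired - H @ x0, rcond=None)
x_comp = x0.copy()
x_comp[comp] += d                     # least-squares-optimal adjustment

err_raw = np.linalg.norm(H @ x0 - y_desired)
err_comp = np.linalg.norm(H @ x_comp - y_desired)
assert err_comp < err_raw             # compensation reduces output error
```

Because the zero adjustment is always feasible, the least-squares solution can never increase the output error; how much it helps depends on the filter and the erasure pattern, which motivates the specialized and iterative variants the abstract lists.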
Abstract:
Recent technical advancements have facilitated the mapping of epigenomes at single-cell resolution; however, the throughput and quality of these methods have limited their widespread adoption. Here we describe a high-quality (10^5 nuclear fragments per cell) droplet-microfluidics-based method for single-cell profiling of chromatin accessibility. We use this approach, named 'droplet single-cell assay for transposase-accessible chromatin using sequencing' (dscATAC-seq), to assay 46,653 cells for the unbiased discovery of cell types and regulatory elements in adult mouse brain. We further increase the throughput of this platform by combining it with combinatorial indexing (dsciATAC-seq), enabling single-cell studies at a massive scale. We demonstrate the utility of this approach by measuring chromatin accessibility across 136,463 resting and stimulated human bone marrow-derived cells to reveal changes in the cis- and trans-regulatory landscape across cell types and under stimulatory conditions at single-cell resolution. Altogether, we describe a total of 510,123 single-cell profiles, demonstrating the scalability and flexibility of this droplet-based platform.