By engineering a parasitic virus, geneticists have taken the first steps toward creating a biological internet in which the body's processes can be improved by controlling the natural communication abilities of cells. Using the M13 virus, Stanford researchers have created a mechanism for sending genetic messages from cell to cell. "The system greatly increases the complexity and amount of data that can be communicated between cells and could lead to greater control of biological functions within cell communities."
Archive for the ‘Computational Science’ category
As promised, it is time to mention the most interesting person I had the chance to talk with at the NAGARA/CoSA Conference in Santa Fe last week. I guess when you ask the right questions "they" will come! By "they" I mean the smart people… After one of the sessions, Mark Conrad, an Archives Specialist working with the Applied Research Division (Office of Information Services) of the National Archives and Records Administration (NARA), approached me. He said, "Aren't you the one asking about open source solutions?" But of course I was the one! I was so excited to hear that NARA is going there!!! I also had the chance to attend a session titled ISO 16363: Audit and Certification of Trustworthy Digital Repositories, delivered by Mark and technology specialists from Kentucky.

This "Archives Specialist" slash technical guru immediately started rattling off a list of tools and projects that I should take a closer look at. Using his tricked-out iPad, he started pulling up screens that popped my top. Mark works in the National Archives Center for Advanced Systems and Technologies (NCAST). In his position with NARA, he works with computer scientists and engineers from all over the world "to leverage new theories, knowledge, methods, and techniques to advance the lifecycle of electronic records." Part of the mission of his division includes looking into "emerging technologies."

I must say I about did a back flip when Mark pulled up images of a Visualization Lab in the works. Simply mind-blowing! There it was: a virtual filing cabinet. As an archivist, I would be able to process (arrange and describe) electronic records using my fingers and a touch screen. Yes, a touch screen: a virtual system used to arrange collections and sort data, color codes and all. The volume of records in a particular series is proportional to the amount of data within a particular sector of the collection.
In January of 2011, the web administrator of NARAtions: The Blog of the United States National Archives interviewed Mark Conrad. She asked him what he was working on and he said “with the assistance of 17 student interns, I am collaborating on a number of projects. For example, many of the students are currently loading large numbers of files into a testbed that is being used by the computer scientists working on the CI-BER project. The purpose of the project is to provide insights into the management of very large data collections. As the number of files and bytes in a collection goes up some of the systems used to manage the collection break down. This project will help us to identify some of the bottlenecks and look for better ways to build systems that don’t break down as the volume picks up.” He also said he was working with the “Department of Energy, NIST, Naval Sea Systems Command, Army Research Lab, and other Federal Agencies on ways to share information about current and emerging practices for managing and preserving engineering data for as long as it is needed.” Sometimes I am glad that I ask a grippa questions— if I didn’t care about open source solutions, I would have never met one of the most interesting archivists with a technical background ever.
*****Now that’s what I’m talkin’ ’bout!!!!!***** I wish I could be a kid in the UK… It’s nice to see the educators there are ready to nerd up. That’s where it’s at! It would be nice to see USA educators do the same, and join the open source revolution!
$35 Computer Goes on Sale
Published by Orion Jones on March 1, 2012 via BigThink at http://bigthink.com/ideafeed/35-computer-goes-on-sale
What’s the Latest Development?
The $35 Raspberry Pi computer, which is now on sale to the public, has been met with extremely high demand. The computer is sold without a keyboard or monitor and is largely a product of British academia and the UK tech industry. With ports for a mouse, a keyboard, and a high-speed internet (Ethernet) cable, the device can be connected to any computer monitor. "Massive demand for the computer has caused the website of one supplier, Leeds-based Premier Farnell, to crash under the weight of heavy traffic."
What’s the Big Idea?
The Raspberry Pi Foundation envisions that the device, which runs on the open source platform Linux, will be used to teach new generations of school children how to program computers. Its release comes at a time when the UK is considering shifting the direction of its national education agenda to emphasize computer programming skills, which many consider essential in today’s world. Although the Foundation wanted the device to be made in the UK, the computer will be assembled in China. A $25 version will go on sale later in the year.
Photo credit: Wikimedia Commons
*****Posted using WordPress for BlackBerry*****
If you are one of the countless people using Symantec enterprise products, this information is for you. It is now confirmed that Symantec source code was snatched up by a hacker with the code name "Yama Tough" in late January. I read several reports tonight, and it is unclear whether the entire source code has been released or only parts of it. What is clear is that at least some of the code has been posted to file sharing networks such as BitTorrent. This could compromise your private data if you use Symantec. Some of the products at risk include Norton Internet Security, Norton Antivirus Corporate Edition, Systemworks, and PCAnywhere. Before releasing the source code through file sharing programs, "Yama Tough" asked for 50 grand to keep the code secret. Apparently the hacker was unhappy with the corporation and started to share the code. If you are indeed using Symantec products, you may want to switch to a new product until all this is ironed out.
***Apple intern’s thesis leaks secret project to port Mac OS X to ARM processors***
Originally published online via Apple Insider on 2.7.2012 by Josh Ong
An academic paper written by a former Apple intern, who now serves as a Core OS engineer at the company, has revealed that Apple was working on a secret experiment to port Mac OS X Snow Leopard to the ARM architecture.
In 2010, Tristan Schaap published a bachelor's thesis on his 12-week stint as an intern with Apple's Platform Technologies Group, a subdivision of the Core OS department. The thesis was originally embargoed because it contained sensitive information, but it was eventually published by the Netherlands' Delft University of Technology several months ago, as reported by iMore.
According to the paper, Schaap worked with the group to get Darwin, the “lower half” of Apple’s Mac OS X operating system, to boot onto an ARM processor from Marvell. During the course of the project, he achieved his goal of “booting into a multi-user prompt,” though some issues still remained due to a “poor implementation on the debug hardware.”
It is, however, highly possible that Apple’s explorations into porting Mac OS X to the ARM architecture were not meant to ever ship in an actual product. The company has been known to place new engineers on decoy projects in order to determine their trustworthiness.
But, it is interesting to note that, according to Schaap’s LinkedIn profile, he joined Apple as a “CoreOS Engineer” after graduation and has worked there for almost a year and a half. His profile lists his 2009 intern position as an “Embedded Bringup Engineer.”
Schaap wrote in his thesis that he faced three technical issues during the 12-week project. Having to create a build system, including a filesystem and kernelcache, from the ground up was one of the obstacles. A stale kernel source was also a problem, since bugs snuck in due to the ARMv5 branch of XNU not having been exercised “in a long time.” Finally, Schaap said issues with the JTAG debugger resulted in an “entire instruction set” being unusable.
In order to get the product ready to ship, Schaap noted that the L2 cache would need to be reworked. Several more drivers would also need to be written for the hardware in order to “fully utilize the potential.” Also, Schaap recommended that several applications be written or ported from other platforms since the userland the team had ported was “not enough to perform the tasks the unit needs to perform.”
Though rumors that Apple has been interested in switching from Intel-based Macs to ARM-based ones have been around for some time, one analyst poured cold water on that likelihood last week after a meeting with Apple CEO Tim Cook. Citi’s Richard Gardner said he walked away from the meeting “with the impression that Apple feels iPad satisfies–or will soon satisfy–the needs of those who might have been interested in such a product” as an ARM-based MacBook Air.
Jefferies analyst Peter Misek had previously predicted that Apple would begin merging Mac OS X and iOS this year with the release of an A6-powered MacBook Air. Last May, a rumor surfaced that Apple had built a test MacBook Air with the same ARM-based A5 processor that was used in the iPad 2. Company executives reportedly felt the prototype performed "better than expected."
Speculation that Apple would port OS X to ARM has also been fueled by the fact that Microsoft announced early last year that Windows 8 will run on the ARM architecture. However, Microsoft's strategy differs from Apple's in that it is making plans for tablets with a full desktop operating system accompanied by a Metro UI layer on top that is optimized for touch. For its part, Apple has preferred to take inspiration from the iPad and bring it back to the Mac, rather than the other way around.
From PowerPC to Intel
Apple spent years preparing for the last major architecture switch on the Mac: the move from PowerPC to Intel. In fact, former executives revealed that the company’s failed effort to port Mac OS to Intel was one of the circumstances that brought co-founder Steve Jobs back to the company. The failure apparently made it clear to Apple that it needed to modernize its operating system, so it decided to purchase NeXT, which Jobs had founded after leaving Apple, to do so.
Jobs went on to accomplish the company’s goals, first modernizing Mac OS in 2001 with the release of Mac OS X and then announcing the switch to Intel in 2005. Parallel Intel-compatible versions of Mac OS X existed alongside the official PowerPC variants for five years prior to the switch, as Jobs reportedly had wanted to go with Intel back then, though he ultimately decided to adopt the G5 processor.
Yet another one of my dream jobs would be to work for the Defense Advanced Research Projects Agency (DARPA). The agency is part of the United States Department of Defense. These knowledge eaters spit fire and develop mind-blowing computer technology for the military. The agency also has a slick mirrored building as the “headquarters” in Arlington, Virginia. Now check out this hot news. Wow!!! I am amazed by this!!! This is real and unbelievable- really… I am due for an eye appointment- wonder if Dr. B can fit me for some of these puppies???
DARPA researchers design eye-enhancing virtual reality contact lenses
Originally published online by DARPA on January 31, 2012
DARPA researchers at Washington-based Innovega are currently developing iOptik contact lenses that enhance normal vision by allowing a wearer to view virtual and augmented reality images without the need for bulky apparatus. Instead of oversized virtual reality helmets, digital images are projected onto tiny full-color displays positioned very near the eye. These novel contact lenses allow users to focus simultaneously on objects that are close up and far away, which could improve the ability to use tiny portable displays while still interacting with the surrounding environment.
The lenses are being developed as part of DARPA's Soldier Centric Imaging via Computational Cameras (SCENICC) program, whose objective is to eliminate the ISR (intelligence, surveillance, and reconnaissance) capability gap that exists at the individual Soldier level. The program seeks to develop novel computational imaging capabilities and to explore the joint design of hardware and software, giving war fighters access to systems that greatly enhance their awareness, security, and survivability.
Please direct all media queries to: DARPAPublicAffairsOffice@DARPA.mil
Today (technically yesterday- it’s after midnight) I made an extraordinary discovery! I was interested in the exact time of the sunset and sunrise so that I could capture a devoted look over the weekend. In my search for information on this, I located the web site of the Astronomical Applications Department of the U.S. Naval Observatory.
Data services on the site include: the complete sun and moon data for one day; a rise/set/twilight table for an entire year (including the times for major solar system objects and bright stars); what the moon looks like right now; the dates of the primary moon phases; altitude (height) and azimuth (angle) of the sun and moon; information about the day and night across the earth; information on equinoxes and solstices; as well as date and calendar conversion charts.
There are also several online calculators available (for solar eclipses, lunar eclipses, and transits of Mercury and Venus). One of the coolest things I found on this site was the Astronomical Information Center, which has a section devoted to astronomical phenomena and celestial navigation. This section features a computing almanac for major solar system bodies and navigational stars.
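The rise/set tables on the site are generated from precise ephemerides, but the basic idea can be sketched with the classic sunrise equation. Here is a minimal Python approximation of daylight duration (my own simplification, not the Observatory's code; it ignores atmospheric refraction and orbital eccentricity, so it is off by several minutes):

```python
import math

def day_length_hours(latitude_deg, day_of_year):
    """Approximate daylight duration using the standard sunrise equation.

    Simplified model: no refraction, circular orbit. The Naval Observatory
    tables use far more precise ephemerides than this.
    """
    # Approximate solar declination (degrees) for the given day of year.
    decl = -23.44 * math.cos(math.radians(360.0 / 365.0 * (day_of_year + 10)))
    # Hour angle at sunrise/sunset: cos(omega) = -tan(lat) * tan(decl).
    x = -math.tan(math.radians(latitude_deg)) * math.tan(math.radians(decl))
    x = max(-1.0, min(1.0, x))  # clamp to handle polar day / polar night
    omega = math.degrees(math.acos(x))
    # The sun moves 15 degrees per hour, so daylight spans 2*omega/15 hours.
    return 2.0 * omega / 15.0
```

At mid-northern latitudes this gives roughly 12 hours near the equinoxes and close to 15 hours near the June solstice, which matches what the site's yearly tables show at a glance.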
I adore research sites!! This site is way cool!!
You have to check it out…
Astronomical Applications Department of the U.S. Naval Observatory
The Open Planets Foundation sent me this information today… In my dreams I can attend this symposium! Talk about a ***dream job***… The keynote speaker, Dr. Jerome McDonough, works with the iSchool at the University of Illinois. He is part of a project called "Preserving Virtual Worlds." Then there is presenter Tom Woolley, a Curator of New Media at the National Media Museum in the UK. Should I cry now or later??? For anyone out there who can attend this event, take notes for me, ok?! It is sure to be a thought-provoking symposium.
*********** Sent: Thu 12/15/2011 2:58 AM***********
I am writing to let you know that bookings are now open for the next POCOS Symposium, on the issues relating to the long-term preservation of computer games and virtual worlds.

This Symposium will be held in The Cardiff Novotel, where POCOS have arranged special discount rates on rooms, together with free delegate parking and wifi. To take advantage of this discount, simply contact the hotel (address above) and tell them that you are attending the POCOS event. The event will be free to attend, but you are asked to make a contribution of £10 towards the cost of coffee and lunches on the two days.

Preservation of video games and virtual worlds presents challenges on many fronts, including complex interdependencies between game elements and platforms; online, interactive and collaborative properties; and diversity in the technologies and practices used for development and curation. This exciting two-day symposium will provide a forum for participants to discuss these challenges, review and debate the latest developments in the field, witness real-life case studies, and engage in networking activities.
The symposium will promote discussion on such topics as:
***Implications and advances in preserving video games and virtual worlds
***Issues of recreating complex technical environments in terms of mods, cracks, plug-ins, joysticks etc. for both console and PC games
***The overriding need to provide an authentic user experience for preserved games
***The Economic Case for re-releasing old games
***Legal and Ethical issues in collecting, curating and preserving virtual worlds
***Interpretation and Documentation, especially metadata
Keynote Speakers and presenters include:
***Dr Jerome McDonough – The iSchool, University of Illinois, USA / Preserving Virtual Worlds Project
***Prof. Richard Bartle FRSA – University of Essex, UK and creator of MUD1
***Dr Dan Pinchbeck – TheChineseRoom, UK and creator of ‘Dear Esther’
***Tom Woolley – Curator of New Media, National Media Museum, UK
***Further speakers have been invited and confirmation of attendance is awaited.
The programme also includes break-out sessions for participants to discuss key topics in the preservation of games and virtual worlds. You can download the event brochure at: http://www.pocos.org/images/pub_material/POCOS_3_LEAFLET_V1.pdf For more information, please visit the POCOS page at: http://www.pocos.org/index.php/pocos-symposia/software-art
My friend Norma works with the Chemistry and Metallurgy Research Replacement Project – Los Alamos National Laboratory. Yesterday she sent me a super cool email with this small article from the lab, and I wanted to share it… I loved this- so very interesting! Computational science, code, stars, spirals, what’s not to love? Unfortunately, I couldn’t get to the LANL news release on this study because it required a login, but I will ask one of my parents to look this up for me. I want to see if there is more information on the computational code used to study the star collision. Hummm?
Christmas Burst Reveals Neutron Star Collision
December 6, 2011
Old model, new data: a match made in the heavens. A strangely powerful, long-lasting gamma-ray burst on Christmas Day, 2010 has finally been analyzed to the satisfaction of a multinational research team. Called the Christmas Burst, GRB 101225A was freakishly lengthy, and it produced radiation at unusually varying wavelengths. But by matching the data with a model developed in 1998, the team was able to characterize the stellar explosion as a neutron star spiraling into the heart of its companion star. The paper, titled "The unusual gamma-ray burst GRB 101225A from a helium star/neutron star merger at redshift 0.33," appeared in a recent issue of the journal Nature. Christina Thöne of Spain's Instituto de Astrofísica de Andalucía is the lead author, and Los Alamos computational scientist Chris Fryer is a contributor.

Fryer, with the Lab's Computer, Computational, and Statistical Sciences Division, realized that the peculiar evolution of the thermal emission (first showing X-rays with a characteristic radius of ~10^11 cm, followed by optical and infrared emission at ~10^14 cm) could be naturally explained by a model he and Stan Woosley of the University of California at Santa Cruz had developed in 1998. "The Helium Merger Model explained all the properties we were seeing," Fryer said, although he noted that proving this required a series of additional computational models by the international theory team studying this "Christmas burst," and the work is still under way.

Fryer is working with Wesley Even of the Los Alamos X Theoretical Design Division, using the U.S. Department of Energy's Advanced Simulation and Computing codes to study the emission of this burst in more detail.
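For context on those "characteristic radius" figures: a thermal emission radius is typically inferred from a luminosity and temperature by treating the source as a blackbody, via the Stefan-Boltzmann law L = 4*pi*R^2*sigma*T^4. A quick Python sketch with purely illustrative numbers (my own ballpark values, not figures from the Nature paper):

```python
import math

# Stefan-Boltzmann constant in CGS units (erg cm^-2 s^-1 K^-4).
SIGMA_SB = 5.6704e-5

def blackbody_radius_cm(luminosity_erg_s, temperature_k):
    """Radius of a sphere radiating as a blackbody: L = 4*pi*R^2*sigma*T^4."""
    return math.sqrt(
        luminosity_erg_s / (4.0 * math.pi * SIGMA_SB * temperature_k**4)
    )

# Illustrative only: an X-ray-emitting fireball at roughly 1 keV (~1.2e7 K)
# with L ~ 1e49 erg/s lands near the ~10^11 cm scale quoted in the article.
r_xray = blackbody_radius_cm(1e49, 1.16e7)
```

The same relation explains why the later, cooler optical/infrared emission implies a much larger radius: at fixed luminosity, a lower temperature forces R upward as T^-2.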
For more information, see the LANL news release.