Archive for the ‘Computational Science’ category

Artificial Intelligence, Deep Neural Networks and Deep Learning: Oh My!!

May 26, 2015

♥In Love with Technology♥

I can’t explain how much I love learning about technological breakthroughs. Very little at the forefront of intellectualism scares me. I recently read a hard copy article in the May 2015 issue of The Economist titled Artificial intelligence: Rise of the machines. The byline asks… “artificial intelligence scares people—excessively so?” Really? What a bunch of wimps!! People continue to fight enlightenment, progression and change to stay in the boxes they have built. Not me.

At a speech in October 2014 at the Massachusetts Institute of Technology, a scholar said that artificial intelligence (AI) was “summoning the demon.” People are paranoid that machines will take over in employment’s race for productivity. With industry powerhouses like Google and Amazon buying AI start-up companies, maybe human worries are justified? We will just find other jobs, right? It’s called perseverance.


•Photo I snapped while reading the article in the library.•

Will computers continue to replace some of the things that people normally do? Probably. I loved this quote. “The torrent of data thrown off by the world’s internet-connected computers, tablets and smartphones, and the huge amounts of computing power now available for processing that torrent, means that their algorithms are more and more capable of understanding languages, recognizing images and the like.” Why didn’t I visit the San Diego Supercomputer Center many years ago when I had the chance? I also could have ditched my conference last October to go there! Now that would have been a real memory to cherish!

The article in The Economist said “signs of the AI boom are everywhere.” Google recently paid $400 million for DeepMind. Have you ever heard of DeepMind? If not, you should so check it out!! Pure awesomeness if you like video games. Just Google it and see. There is also a great article in The New Yorker which discusses how deep neural networks operate. Deep neural networks are used by companies like DeepMind. These artificial networks are much like the neural networks in the human brain. It is amazing to read about.

The newest form of AI tied to deep neural networks is now capable of “deep learning!” Computers can learn through the analysis of large amounts of data using algorithms. Freak out on the algorithm Facebook recently deployed. Did you think you were anonymous in that untagged photo? Think again… DeepFace “can recognise specific human faces in images around 97% of the time, even when those faces are partly hidden or poorly lit.” I want to be that smart and write programs like this. It’s not fair!! Male engineers created DeepFace and I give them tons of respect, but why are intelligent women often seen as domineering? That’s not fair either.
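The learning loop behind all of this can be sketched in a few lines. Here is a toy, made-up example (nothing like Facebook’s DeepFace, which tunes millions of weights over huge image datasets): a program “learns” the rule hidden in some example data by repeatedly nudging a weight to shrink its error.

```python
# A toy illustration of learning from data with an algorithm
# (gradient descent). The data and learning rate are invented
# for illustration; the idea, not the scale, is the point.

data = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)]  # (input, output) pairs; hidden rule is y = 2x

w = 0.0                # initial guess for the weight
learning_rate = 0.05

for epoch in range(200):
    for x, y in data:
        error = w * x - y                # how far off the current prediction is
        w -= learning_rate * error * x   # nudge w to shrink the error

print(round(w, 2))  # the learned weight settles near 2.0
```

After enough passes over the data the program has “learned” the rule without anyone writing it down explicitly, which is the core idea behind deep learning at a much grander scale.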


•Smart Woman Army•

Another thing I found interesting in the article: since most data is labeled by humans, and algorithms need that labeled data to learn, another race is on. It is a race to develop “unsupervised-learning” algorithms, which would basically eliminate the need for human labeling. How accurate will it be? I guess we will see. Artificial neural networks were invented in the 1950s by people with big brains who wanted bigger, faster, more accurate brains! I lovvvvve brains!! I am so not turned off by them!! Haha… These smart people simulated the neurons and electrochemicals in a human brain to create artificial intelligence. It worked!!
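Those early simulations boiled a neuron down to something surprisingly simple. As a hedged sketch (the weights here are my own toy values, not anything from the article), a single artificial neuron just sums weighted inputs and “fires” if the total clears a threshold:

```python
def neuron(inputs, weights, bias):
    """Fire (return 1) if the weighted sum of the inputs clears the threshold."""
    activation = sum(i * w for i, w in zip(inputs, weights)) + bias
    return 1 if activation > 0 else 0

# One neuron wired up as an AND gate: it fires only when both inputs are 1.
weights, bias = [1.0, 1.0], -1.5
for a in (0, 1):
    for b in (0, 1):
        print(a, b, "->", neuron([a, b], weights, bias))
```

Chain enough of these together in layers and you get the deep neural networks the article raves about.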


♦"Just watch! Imma make my perfect woman!" (Dr.J before the chemical waste accident that birthed The Joker and this is not Harley Quinn)♦

If you are a brave fellow intellectual and enjoy all things mind blowing, you should read the article in The Economist. It is so worth the read. You can also learn about the interesting problem with AI. Do you know the one thing people can immediately identify that a computer simply can’t define? Porn… Yes… pornography. I guess machines provide plenty of access to porn, but don’t ask a damn machine to intelligently recognize porn lol. We can leave that type of analysis to the humans!

Vu Digital Translates Videos Into Structured Data

May 4, 2015

This is awesome! It basically extracts descriptive metadata!!! Nice!! Useful for sure!
~~~Felicia

Unique Identifiers: A Closer Look at Biometric Technology in New Mexico

December 3, 2014
Biometrics_by Felicia Lujan_December2014

|Biometrics~ A digital composite by Felicia Lujan. This composite is composed of 13 layers, 8 masks, 3 color overlays, and a Gaussian blur. The composite includes images of binary code and components of ocular, palm vein, and voice recognition scans.|


**NOTE: This research was not intended to promote or renounce the use of biometric systems, though I do find the technology extremely interesting and useful in most cases. I understand that the use of this technology is considered controversial by some. I intend to continue my exploration into how biometric technology is being used around the world for the greater good.

________________________________
I am an archivist with a deep love of technology, which is one reason I pursued a master’s-level certification in digital information management. A little over a week ago, I was in a meeting that reignited my interest in biometrics. I must admit that I was naïve in my assumption that my state was not a pioneer in this industry. First off, I didn’t know that the central nervous system of New Mexico state government (aka the State Data Center at the Department of Information Technology) utilizes biometric technology as a method of security. After that meeting I came home curious about how involved New Mexico is when it comes to biometric research and implementation. The writer, the researcher, the analyst, the special agent in me took over and that night I added biometric engineer to my list of dream jobs that I would love to have. So…what type of education does a biometric engineer need? Most commonly, a biometrics engineer has: a computer science degree; a computer language certification like Java or C++; and good problem-solving, people, and technical skills.

I found an informative link online titled “Become a Biometrics Engineer: Education and Career Roadmap.” Hum? Well, according to this plan, there are only 7 “popular schools” specializing in advancing a career in biometrics. The page said that “biometric technologies include complex equipment designed to analyze personal identification markers unique to each individual, such as fingerprints, ear lobes, vein patterns, voices, and iris shapes.” Through this research, I discovered that the technology is not limited to “individuals” or people here in New Mexico. I did know that biometric engineers were software developers, but there was a lot that I didn’t know before I embarked upon this research over the Thanksgiving break. Ear lobes? Veins? Hum? Didn’t know those were used as unique identifiers? We are all well aware of the TV shows touting the sexy use of biometrics, like CSI and most recently my beloved Scandal, but that’s just on TV right? A dead guy’s index finger couldn’t possibly be used to confirm his identity? Could it Shonda? Maybe I should ask Chien Le?

The most information dense white paper I discovered was written by Chien Le of the Department of Computer Science and Engineering at Washington University in November of 2011. Le wrote A Survey of Biometrics Security Systems, which introduced biometric security systems and outlined application fields for biometric technologies, solutions, middleware and software, advantages and disadvantages, acronyms, and the future uses of biometrics. Damn! Chien Le beat me to the punch didn’t he?! Here it was…all laid out for my thirsty mind. Le’s paper says there are “seven basic criteria for biometric security systems.” These are “uniqueness, universality, permanence [hummm?? Do I hear digital preservation?], collectability, performance, accessibility and circumvention.” I don’t completely understand some of the criteria, but it was very useful to read over the types of biometric solutions outlined by Le. Current technologies include: facial recognition detectors, fingerprint readers, voice recognition, iris scanners, vein recognition, DNA biometric systems, and 2D barcode scanners, among others.
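To make the criteria concrete, here is a hedged sketch of the decision at the heart of any biometric verification system: compare a fresh scan against the enrolled template and accept only if the similarity score clears a threshold. The feature vectors and threshold below are invented for illustration; real systems extract far richer features from a fingerprint, iris, or voice.

```python
# A minimal sketch of biometric verification: score a fresh sample
# against a stored template and compare the score to a threshold.
# All numbers here are made up for illustration.

import math

def similarity(template, sample):
    """Cosine similarity between two feature vectors (1.0 = identical)."""
    dot = sum(t * s for t, s in zip(template, sample))
    norm = math.sqrt(sum(t * t for t in template)) * math.sqrt(sum(s * s for s in sample))
    return dot / norm

THRESHOLD = 0.95  # tuned to balance false accepts against false rejects

enrolled = [0.8, 0.1, 0.5, 0.3]      # template captured at enrollment
good_scan = [0.79, 0.12, 0.5, 0.31]  # same person, slight sensor noise
bad_scan = [0.1, 0.9, 0.2, 0.7]      # a different person

print(similarity(enrolled, good_scan) >= THRESHOLD)  # True: accepted
print(similarity(enrolled, bad_scan) >= THRESHOLD)   # False: rejected
```

This also explains the hoarse-voice problem: if illness distorts the captured features, the score falls below the threshold and the rightful user is rejected.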

This technology can have good uses, but there are many privacy advocates who are against the use of any biometrics. In December of 2013, Scientific American published Biometric Security Poses Huge Privacy Risks by Oliver Munday with a byline which read “without explicit safeguards, your personal biometric data are destined for a government database.” The article starts with the sentence “security through biology is an enticing idea.” Yeah it is. Is that all it is though? An idea? I think not. Maybe I’m not worried about privacy as much as I should be? The article is basically a call to United States Congress for “lasting protections against the misuse of biometric data.” Munday quoted an attorney with the Electronic Frontier Foundation who seems to fear that biometric data will be used genetically to test for criminal predisposition. I’m actually not sure that’s a bad thing? I guess my only concerns at this point would be relative to health information and insurance coverage. When it comes to physical security and data security, personally, I think that biometric technology is necessary. It is a way to uniquely protect data, which in the end equals the preservation of knowledge and heightened security.

Over the weekend I started whittling through what I found. I read a great many articles and a few white papers before I started to look at projects going on closer to home. The more I researched this topic, the more information I found. I was most interested in how biometric systems actually work, so I focused my mind on the technical aspects. I had questions like…what are the major components of a biometric system? Who uses these systems? One of my questions was answered in Le’s paper. I have a sore throat now, so last night I wondered…what if a person needed to use voice recognition and something was wrong with their voice? How is that accounted for in designing a successful system? According to Le, there was no solution. A voice recognition system will not recognize a hoarse voice wave. So now that we have some background on the basics of biometrics, let’s take a look at what I found going on right here in my state. I was able to locate information on at least ten concrete areas where biometric technology is being used in New Mexico from at least 2003 to 2014. I’m sure there are many projects I missed, but frankly, this could be a thesis and maybe even a dissertation. This is just a quick look at highly visible projects I came across over the last week.

We will start with the New Mexico Department of Information Technology (DoIT) since it is a meeting with this office that rekindled my interest in this technology. DoIT is “responsible for infrastructure IT services provided 24x7x365 which includes: the State’s telecommunications system, two-way public safety radio, digital microwave, the State’s core data network and internet connectivity, and the State’s Data Center.” It is here, in the State Data Center where biometric technologies are being used for data security. I was impressed with my state when I learned that, and tomorrow I will get a tour of the center. “The State’s Data Center provides a secure facility with redundant power and cooling which houses many of the State’s critical IT systems including the State’s mainframe and agency servers. This division also provides enterprise system services which include the State’s consolidated email system…” It will be interesting to see what type of biometric security the agency is using as of late. I am guessing a finger or palm scanner?

The two strangest projects I found information on were tied to the use of biometrics on kids and animals in New Mexico. On April 3, 2013, there was a news release put out by KOAT (channel 7) titled Los Lunas School Offers Biometric Scans at Lunch. What? Seriously? Yes. Seriously. The school apparently tried to implement a palm vein scanner in the lunch room instead of good old meal tickets or cards. Parents were not happy about the suggestion of using infrared wavelengths (electromagnetic radiation) during the lunch hour to ID their children. The parents fought off the proposal which would have allowed scanners to recognize a unique vein pattern in the child’s palm and they won. I wasn’t sure which seemed stranger…scanning kids or scanning animals? I also read about how the New Mexico livestock industry is using Retinal Vascular Pattern (RVP) for livestock identification. RVP is the pattern of blood vessels at the back of the eye. It is being called the new way of branding animals. I wonder how ranchers feel about that since they must prefer the old burn and freeze methods? What’s a brand without cowboy symbology right?

I discovered that the national labs and the air force bases are also using biometrics. Of course, this was no surprise. I read a white paper Chris Aldridge prepared for Sandia National Laboratories and Lawrence Livermore National Laboratory in June of 2013. Sandia Report No. SAND2013-4922 is titled Mobile Biometric Device (MBD) Technology: Summary of Selected First Responder Experiences in Pilot Projects. This report concentrated on the use of MBDs to enroll individuals in databases and perform “identification checks of subjects in the field area,” for “military, law enforcement, and homeland security operations.” The report was a multi-agency/multi-state project with 3M Cogent Systems and involved: Iowa, Colorado, California, D.C., Texas, Washington (Seattle), Arizona, Virginia, West Virginia, Illinois, Wisconsin, Arkansas, and Idaho. I think the most interesting part of this study used a “mock prison riot” for first responders out of West Virginia. We all know how critical that information is given New Mexico’s prison riot history. Many of the agencies studied for this report are using “Fusion devices.” Fusion was developed by 3M Cogent Systems for the Department of Defense. A large part of the studies in this field is tied to law enforcement, but currently the technology trend is leaning towards cyber security.

The National Institute of Standards and Technology (NIST) says biometrics are important because they: secure facilities, protect access to computer networks, counter fraud, screen people at our borders, and fight crime. The NIST says this technology is used to manage identities for: first responders at the scene of a natural disaster, border patrol, soldiers in theater, and police officers on the street. It makes sense that the following projects are closely related to the projects cited in the Sandia report. In New Mexico, the Federal Bureau of Investigation (FBI) uses the Combined DNA Index System (CODIS) to support criminal justice DNA databases. The National DNA Index System or NDIS is part of CODIS. The FBI uses biometrics to analyze data from DNA databases and for latent print analysis. Holloman Air Force Base is using the 49th Security Forces Defense Biometric Identification System which is comprised of hand-held scanners. The scanners are used to screen people entering the base to verify the access authorization. Identity is established using barcode technology and fingerprints. In February of 2011, it was announced that Santa Fe County was using biometrics to “remove aliens convicted of a crime.” It can also be noted that between 2003 and 2005, the National Academy of Engineering (NAE) researched the use of biometrics in handgun grips while working with a New Mexico biometrics company. The NAE was interested in developing biometric grip sensors, but a 2005 report declared the tests a failure.

I also located evidence of the health care systems in New Mexico using biometric technology. The University of New Mexico Hospital (UNMH) offers Biometrics Screening Services as part of Employee Health Plans. These screenings are said to align with recommendations of the United States Preventive Services Task Force (USPSTF). Ommmm…Maybe this is where my privacy fears rest? In 2013, the American College of Occupational and Environmental Medicine released a Joint Consensus Statement on Biometric Health Screening for Employers. According to the “statement,” the United States Center for Disease Control and Prevention defines biometric screenings as “the measurement of physical characteristics such as height, weight, BMI, blood pressure, blood cholesterol, blood glucose, and aerobic fitness that can be taken at the worksite and used as part of a workplace health assessment to benchmark and evaluate changes in employee health status over time.” I am a fitness freak, but that seems crazy? What if something is wrong with me and I don’t know? The statement outlines the “purpose of screenings” and I found it kind of scary. What if they find out I experience shortness of breath or I’m genetically predisposed to cancer? Will they drop me from my insurance plan?

In New Mexico health circles, I also located a “Fingerprint Techniques Manual,” which was prepared by the New Mexico Department of Health. The manual had very interesting graphic illustrations on the fundamentals of fingerprints. This training tool covered everything from patterns to arches to loops to lines to deltas to cores to whorls to scars of the fingerprints. The machines can read all these intricate things. The Division of Health Improvement uses this technology as part of the Caregivers Criminal History Screening Program. Makes more sense than the biometric screenings. I feel comfortable with this use. This type of use can protect people from abuse or other forms of criminal activity. I was rather impressed with the 36 page manual. It reminded me that about 15 years ago I applied for a fingerprint technician position with the Department of Public Safety. I was crushed to learn that these people don’t make very much. I don’t know…I guess you have to be a biometrics engineer to make it out there!? What I do know is that I found a great deal of information about how New Mexico is actively participating in the biometric industry.

I gained useful knowledge by researching biometrics and then regurgitating what I learned. My son just asked me what I was writing about and when I told him he looked at me with the curiosity that I love and see in myself. I told him “I’m writing about biometrics. Do you know what that is?” I explained with words and then decided it was easier to show a nine year old a catchy tech video with visual candy. Together we learned about the future of biometric systems. Between October and November of this year there were several videos on the use of biometric technology. The National Science Foundation released information on a project by a young man studying the use of ocular biometrics in the video game industry for disabled people. In October the Telegraph out of the United Kingdom released a video declaring that we would simply kill passwords with biometrics and CBS news declared that biometric palm scans will help keep hospitals secure.

The future of biometrics is here. It is everywhere and happening all around us. Biometrics is about identifying who we are and not who we say we are. Tonight I learned that the most accurate method for a biometric reading is the heartbeat or an electrocardiogram (ECG). Makes sense, huh? It’s symbolic actually. Symbolic because the heart is at our biometric core. It is the giver of life. The heart represents how we feel and who we are. That beat is indeed a unique identifier.


Sources:

News release, Santa Fe County and All New Mexico Now Benefit from ICE Strategy to Use Biometrics to Identify and Remove Aliens Convicted of a Crime, released on ice.gov, February 15, 2011

White paper, A Survey of Biometrics Security Systems by Chien Le, Department of Computer Science and Engineering, Washington University, November 28, 2011

News release, Los Lunas School Offers Biometric Scans at Lunch, released on koat.com, April 3, 2013

White paper, Mobile Biometric Device (MBD) Technology: Summary of Selected First Responder Experiences in Pilot Projects by Chris Aldridge, Sandia Report No. SAND2013-4922, prepared by Sandia National Laboratories and Lawrence Livermore National Laboratory, June 2013

Article, Biometric Security Poses Huge Privacy Risks by Oliver Munday, released on scientificamerican.com, December 17, 2013

Publication, Fingerprint Techniques Manual, prepared by New Mexico Department of Health, Division of Health Improvement, Caregivers Criminal History Screening Program, no date

Various internet searches for basic information in articles and videos

IO launches an OpenStack cloud running on open source servers

February 1, 2014

~~~Awesome. •••Felicia

Gigaom

Modular data center expert IO is getting into the cloud provider business, launching a new service called IO.Cloud that’s built using Open Compute server designs and runs the OpenStack cloud computing operating system.

That’s a lot of open source, but the company seems to think it’s necessary. According to the IO.Cloud website: “IO.Cloud is built on Open Compute because it provides our engineers with the flexibility to configure and optimize the hardware specifically for scale cloud deployments … IO.Cloud uses OpenStack Cloud components that are interoperable and designed to support standardized hardware implementations.”

IO is pitching IO.Cloud as an enterprise cloud offering, and if it plans to legitimately compete against larger cloud providers for those workloads, the company and its cloud can use any advantages they can get. IO.Cloud is available in hosted and on-premises versions, and the Open Compute hardware almost certainly will let IO operate its public cloud infrastructure more efficiently, as well…


Death of the Algorithm

July 2, 2013

“It is so strange and other~ish that it
becomes a stream~of~consciousness
algorithm unto itself~
something almost inhuman.”

~ Jerry Saltz ~
(American Art Critic)


An algorithm is used in mathematics as a detailed procedure for calculations. Algorithms are used for processing data, as well as for automating reasoning. The Scarf Algorithm has been used by researchers for stable matching and for computing core elements of balanced games. Herbert Scarf is a Sterling Professor Emeritus of Economics at Yale University. In 1981, he produced this algorithm for integer programming and the calculation of nonlinear complementarity problems. He is also listed in the Mathematics Genealogy Project, hosted by North Dakota State University, which identifies 138 of his academic descendants.
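For readers who have never seen one written out, a “detailed procedure for calculations” can be tiny. Euclid’s algorithm for the greatest common divisor is the classic example (far simpler than the Scarf Algorithm, but the same idea of a mechanical, repeatable procedure):

```python
def gcd(a, b):
    """Euclid's algorithm: repeat a simple remainder step until b hits zero."""
    while b != 0:
        a, b = b, a % b
    return a

print(gcd(48, 18))  # -> 6
```

Every algorithm, from this one to the ones driving deep learning, is just such a procedure scaled up.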

Death of the Algorithm by Felicia Lujan

……….~~…Death of the Algorithm…~~……….
Digital composite by Felicia Lujan
Includes five layers: my photograph; an image of a skull; an image of the Scarf Algorithm; and two color layers.

October 5, 2012

This is absolutely amazing!!!!
I love this… ~F

Virtually Pop Your Top

July 24, 2012

A virtual collection of electronic records which can be sorted using your fingers and a touch screen the size of a movie screen. The data can also be manipulated in various ways to improve collection control. This image was taken at the 2012 E-Records Forum in Austin, Texas. An Open House at the Texas Advanced Computing Center’s Visualization Lab was apparently a “highlight” of the forum.


As promised, it is time to mention the most interesting person I had the chance to talk with at the NAGARA/CoSA Conference in Santa Fe last week. I guess when you ask the right questions “they” will come! By they I mean the smart people… 🙂 After one of the sessions, Mark Conrad, an Archives Specialist working with the Applied Research Division (Office of Information Services) of the National Archives and Records Administration (NARA), approached me. He said “aren’t you the one asking about open source solutions?” But of course I was the one! I was so excited to hear that NARA is going there!!! I also had the chance to attend a session titled ISO 16363 Audit and Certification of Trustworthy Digital Repositories. The session was delivered by Mark and Technology Specialists from Kentucky.

This “Archives Specialist” slash technical guru immediately started rattling off a list of tools and projects that I should take a closer look at. Using his tricked out iPad he started prompting his screen to pop my top. Mark works in the Center for Advanced Systems and Technologies (NCAST). In his position with NARA, he works with computer scientists and engineers from all over the world “to leverage new theories, knowledge, methods, and techniques to advance the lifecycle of electronic records.” Part of the mission of his division includes looking into “emerging technologies.”

I must say I about did a back flip when Mark pulled up images of a Visualization Lab in the works. Simply mind blowing! There it was— a virtual filing cabinet. As an archivist, I would be able to process or arrange and describe electronic records by using my fingers and a touch screen. Yes, a touch screen: a virtual system used to arrange collections and sort data, with color codes and all. The volume of records in a particular series is proportional to the amount of data within a particular sector of the collection.
In January of 2011, the web administrator of NARAtions: The Blog of the United States National Archives interviewed Mark Conrad. She asked him what he was working on and he said “with the assistance of 17 student interns, I am collaborating on a number of projects. For example, many of the students are currently loading large numbers of files into a testbed that is being used by the computer scientists working on the CI-BER project. The purpose of the project is to provide insights into the management of very large data collections. As the number of files and bytes in a collection goes up some of the systems used to manage the collection break down. This project will help us to identify some of the bottlenecks and look for better ways to build systems that don’t break down as the volume picks up.” He also said he was working with the “Department of Energy, NIST, Naval Sea Systems Command, Army Research Lab, and other Federal Agencies on ways to share information about current and emerging practices for managing and preserving engineering data for as long as it is needed.” Sometimes I am glad that I ask a grippa questions— if I didn’t care about open source solutions, I would have never met one of the most interesting archivists with a technical background ever.

The New Raspberry: A Computer “Crashing” the Competition

March 1, 2012
There is high demand for the low cost Raspberry Pi as educators in the UK join the open source revolution!


*****Now that’s what I’m talkin’ ’bout!!!!!***** I wish I could be a kid in the UK… It’s nice to see the educators there are ready to nerd up. That’s where it’s at! It would be nice to see USA educators do the same, and join the open source revolution!
…………………………….

$35 Computer Goes on Sale
Published by Orion Jones on March 1, 2012 via BigThink at http://bigthink.com/ideafeed/35-computer-goes-on-sale

What’s the Latest Development?

The $35 Raspberry Pi computer, which is now on sale to the public, has been met with extremely high demand. The computer is sold without a keyboard or monitor and is mainly a product of the English academy and the UK tech industry. With ports for a mouse, keyboard and a high-speed internet cable, the device can be connected to any computer monitor. “Massive demand for the computer has caused the website of one supplier, Leeds-based Premier Farnell, to crash under the weight of heavy traffic.” 

What’s the Big Idea?

The Raspberry Pi Foundation envisions that the device, which runs on the open source platform Linux, will be used to teach new generations of school children how to program computers. Its release comes at a time when the UK is considering shifting the direction of its national education agenda to emphasize computer programming skills, which many consider essential in today’s world. Although the Foundation wanted the device to be made in the UK, the computer will be assembled in China. A $25 version will go on sale later in the year. 

Photo credit: wikimedia commons

*****Posted using WordPress for BlackBerry*****

FYI: Symantec Has Been Hacked

February 9, 2012

If you are one of the countless people using Symantec enterprise products, this information is for you. It is now confirmed that the Symantec source code was snatched up by a hacker with the code name “Yama Tough” in late January. I read several reports tonight, and it is unclear whether the entire source code has been released or only parts of it. What is clear is that at least some of the code has been posted to file sharing sites such as BitTorrent. This could compromise your private data if you use Symantec. Some of the products at risk include: Norton Internet Security; Norton Antivirus Corporate Edition; Systemworks; and PCAnywhere. Before releasing the source code through file sharing programs, “Yama Tough” asked for 50 grand to keep the code secret. Apparently the hacker was unhappy with the corporation, and started to share the code. You may want to switch to a new product until all this is ironed out if you are indeed using Symantec products.

*****Posted using WordPress for BlackBerry*****

Covert Ops on Ops: Apple’s Possible Move from Intel-based Macs to ARM-based Macs

February 7, 2012

Thesis reveals a secret project on Apple’s OS architecture.


***Apple intern’s thesis leaks secret project to port Mac OS X to ARM processors***

Originally published online via Apple Insider on 2.7.2012 by Josh Ong

An academic paper written by a former Apple intern who now serves as a Core OS engineer at the company has revealed that Apple was working on a secret experiment to port Mac OS X Snow Leopard to the ARM architecture.

In 2010, Tristan Schaap published a Bachelor thesis on his 12-week stint as an intern with Apple’s Platform Technologies Group, a subdivision of the Core OS department. The thesis was originally embargoed because it contained sensitive information, but it was eventually published by the Netherlands’ Delft University of Technology several months ago, as reported by iMore.

According to the paper, Schaap worked with the group to get Darwin, the “lower half” of Apple’s Mac OS X operating system, to boot onto an ARM processor from Marvell. During the course of the project, he achieved his goal of “booting into a multi-user prompt,” though some issues still remained due to a “poor implementation on the debug hardware.”

It is, however, highly possible that Apple’s explorations into porting Mac OS X to the ARM architecture were not meant to ever ship in an actual product. The company has been known to place new engineers on decoy projects in order to determine their trustworthiness.

But, it is interesting to note that, according to Schaap’s LinkedIn profile, he joined Apple as a “CoreOS Engineer” after graduation and has worked there for almost a year and a half. His profile lists his 2009 intern position as an “Embedded Bringup Engineer.”

Schaap wrote in his thesis that he faced three technical issues during the 12-week project. Having to create a build system, including a filesystem and kernelcache, from the ground up was one of the obstacles. A stale kernel source was also a problem, since bugs snuck in due to the ARMv5 branch of XNU not having been exercised “in a long time.” Finally, Schaap said issues with the JTAG debugger resulted in an “entire instruction set” being unusable.

In order to get the product ready to ship, Schaap noted that the L2 cache would need to be reworked. Several more drivers would also need to be written for the hardware in order to “fully utilize the potential.” Also, Schaap recommended that several applications be written or ported from other platforms since the userland the team had ported was “not enough to perform the tasks the unit needs to perform.”

Though rumors that Apple has been interested in switching from Intel-based Macs to ARM-based ones have been around for some time, one analyst poured cold water on that likelihood last week after a meeting with Apple CEO Tim Cook. Citi’s Richard Gardner said he walked away from the meeting “with the impression that Apple feels iPad satisfies–or will soon satisfy–the needs of those who might have been interested in such a product” as an ARM-based MacBook Air.

Jefferies analyst Peter Misek had previously predicted that Apple would begin merging Mac OS X and iOS this year with the release of an A6-powered MacBook Air. Last May, a rumor surfaced that Apple had built a test MacBook Air with the same ARM-based A5 processor that was used in the iPad 2. Company executives reportedly felt the prototype performed "better than expected."

Speculation that Apple would port OS X to ARM has also been fueled by Microsoft's announcement early last year that Windows 8 will run on the ARM architecture. However, Microsoft's strategy differs from Apple's in that it is planning tablets with a full desktop operating system topped by a touch-optimized Metro UI layer. For its part, Apple has preferred to take inspiration from the iPad and bring it back to the Mac, rather than the other way around.

From PowerPC to Intel

Apple spent years preparing for the last major architecture switch on the Mac: the move from PowerPC to Intel. In fact, former executives revealed that the company’s failed effort to port Mac OS to Intel was one of the circumstances that brought co-founder Steve Jobs back to the company. The failure apparently made it clear to Apple that it needed to modernize its operating system, so it decided to purchase NeXT, which Jobs had founded after leaving Apple, to do so.

Jobs went on to accomplish the company's goals, first modernizing Mac OS in 2001 with the release of Mac OS X and then announcing the switch to Intel in 2005. Parallel Intel-compatible versions of Mac OS X existed alongside the official PowerPC variants for five years before the switch; Jobs reportedly had wanted to go with Intel back then, though he ultimately decided to adopt the G5 processor.

DARPA Brainz Enhance Reality Using Contact Lenses!

February 5, 2012
Defense Advanced Research Projects Agency/US Dept. of Defense... Researchers Create New Contacts


Yet another one of my dream jobs would be to work for the Defense Advanced Research Projects Agency (DARPA). The agency is part of the United States Department of Defense. These knowledge eaters spit fire and develop mind-blowing computer technology for the military. The agency also has a slick mirrored building as its "headquarters" in Arlington, Virginia. Now check out this hot news. Wow!!! I am amazed by this!!! This is real and unbelievable, really… I am due for an eye appointment; wonder if Dr. B can fit me for some of these puppies???
*****

DARPA researchers design eye-enhancing virtual reality contact lenses

Originally published online by DARPA on January 31, 2012 

Currently being developed by DARPA researchers at Washington-based Innovega iOptiks are contact lenses that enhance normal vision by allowing a wearer to view virtual and augmented reality images without the need for bulky apparatus. Instead of oversized virtual reality helmets, digital images are projected onto tiny full-color displays that sit very near the eye. These novel contact lenses allow users to focus simultaneously on objects that are close up and far away, which could improve the ability to use tiny portable displays while still interacting with the surrounding environment.

The lenses are being developed as part of DARPA's Soldier Centric Imaging via Computational Cameras (SCENICC) program, whose objective is to eliminate the intelligence, surveillance and reconnaissance (ISR) capability gap that exists at the individual Soldier level. The program seeks to develop novel computational imaging capabilities and explore joint design of hardware and software that give war fighters access to systems that greatly enhance their awareness, security and survivability.

Please direct all media queries to: DARPAPublicAffairsOffice@DARPA.mil

*****Posted using WordPress for BlackBerry*****

Make Time to Navigate the Heavens

January 28, 2012

The Seduction of Virgo

Today (technically yesterday; it's after midnight) I made an extraordinary discovery! I was interested in the exact times of sunset and sunrise so that I could catch a good look at both over the weekend. In my search for this information, I located the website of the Astronomical Applications Department of the U.S. Naval Observatory.

Data services on the site include: the complete sun and moon data for one day; a rise/set/twilight table for an entire year (including the times for major solar system objects and bright stars); what the moon looks like right now; the dates of the primary moon phases; altitude (height) and azimuth (angle) of the sun and moon; information about the day and night across the earth; information on equinoxes and solstices; as well as date and calendar conversion charts.

There are also several online calculators available (for solar eclipses, lunar eclipses, and transits of Mercury and Venus). One of the coolest things I found on this site was the Astronomical Information Center. There is a section devoted to astronomical phenomena and celestial navigation. This section features a computing almanac for major solar system bodies and navigational stars.
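The site's date and calendar conversions are built on the Julian Day system astronomers use to count days continuously. As a rough illustration (the function names are my own, and this sketch handles only the Gregorian calendar, not the site's full converter), the standard integer algorithm looks like this:

```python
def julian_day_number(year, month, day):
    """Gregorian calendar date -> Julian Day Number (valid at noon UT)."""
    a = (14 - month) // 12          # 1 for Jan/Feb, else 0
    y = year + 4800 - a             # shift year so it is always positive
    m = month + 12 * a - 3          # March = 0 ... February = 11
    return (day + (153 * m + 2) // 5 + 365 * y
            + y // 4 - y // 100 + y // 400 - 32045)

def gregorian_from_jdn(jdn):
    """Inverse conversion: Julian Day Number -> (year, month, day)."""
    a = jdn + 32044
    b = (4 * a + 3) // 146097       # number of 400-year cycles
    c = a - 146097 * b // 4
    d = (4 * c + 3) // 1461         # years within the cycle
    e = c - 1461 * d // 4
    m = (5 * e + 2) // 153          # month index counted from March
    day = e - (153 * m + 2) // 5 + 1
    month = m + 3 - 12 * (m // 10)
    year = 100 * b + d - 4800 + m // 10
    return year, month, day
```

For example, `julian_day_number(2000, 1, 1)` gives 2451545, the Julian Day Number for noon UT on January 1, 2000, and `gregorian_from_jdn` takes it back to the calendar date.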

I adore research sites!! This site is way cool!!
You have to check it out…

Astronomical Applications Department of the U.S. Naval Observatory

http://aa.usno.navy.mil/data/

In My Dreams: Long-term Preservation of Computer Games and Virtual Worlds

December 15, 2011

The Open Planets Foundation sent me this information today… In my dreams I can attend this symposium! Talk about a ***dream job***… The keynote speaker, Dr. Jerome McDonough, works with the iSchool at the University of Illinois. He is part of a project called "Preserving Virtual Worlds." Then there is presenter Tom Woolley, a Curator of New Media at the National Media Museum in the UK. Should I cry now or later???   😉   For anyone out there who can attend this event, take notes for me, ok?! It is sure to be a thought-provoking symposium.

*********** Sent: Thu 12/15/2011 2:58 AM***********

I am writing to let you know that bookings are now open for the next POCOS Symposium, on the issues relating to the long-term preservation of computer games and virtual worlds.

This Symposium will be held in The Cardiff Novotel, where POCOS have arranged special discount rates on rooms, together with free delegate parking and wifi. To take advantage of this discount, simply contact the hotel (address above) and tell them that you are attending the POCOS event. The event will be free to attend, but you are asked to make a contribution of £10 towards the cost of coffee and lunches on the two days.

Preservation of video games and virtual worlds presents challenges on many fronts, including complex interdependencies between game elements and platforms; online, interactive and collaborative properties; and diversity in the technologies and practices used for development and curation. This exciting two-day symposium will provide a forum for participants to discuss these challenges, review and debate the latest developments in the field, witness real-life case studies, and engage in networking activities.

The symposium will promote discussion on such topics as:

***Implications and advances in preserving video games and virtual worlds

***Issues of recreating complex technical environments in terms of mods, cracks, plug-ins, joysticks etc. for both console and PC games

***The overriding need to provide an authentic user experience for preserved games

***The economic case for re-releasing old games

***Legal and Ethical issues in collecting, curating and preserving virtual worlds

***Interpretation and Documentation, especially metadata

Keynote Speakers and presenters include:

***Dr Jerome McDonough – The iSchool, University of Illinois, USA / Preserving Virtual Worlds Project

***Prof. Richard Bartle FRSA – University of Essex, UK and creator of MUD1

***Dr Dan Pinchbeck – TheChineseRoom, UK and creator of ‘Dear Esther’

***Tom Woolley – Curator of New Media, National Media Museum, UK

***Further speakers have been invited and confirmation of attendance is awaited.

The programme also includes break-out sessions for participants to discuss key topics in the preservation of games and virtual worlds. You can download the event brochure at: http://www.pocos.org/images/pub_material/POCOS_3_LEAFLET_V1.pdf  For more information, please visit the POCOS page at: http://www.pocos.org/index.php/pocos-symposia/software-art

A Christmas Star Explosion: Neutron Star Spirals Into the Heart of a Companion Star

December 8, 2011

My friend Norma works with the Chemistry and Metallurgy Research Replacement Project – Los Alamos National Laboratory. Yesterday she sent me a super cool email with this small article from the lab, and I wanted to share it… I loved this- so very interesting! Computational science, code, stars, spirals, what’s not to love? Unfortunately, I couldn’t get to the LANL news release on this study because it required a login, but I will ask one of my parents to look this up for me. I want to see if there is more information on the computational code used to study the star collision. Hummm?

***********

Christmas Burst Reveals Neutron Star Collision

December 6, 2011

***********Christmas Burst, GRB 101225A; NASA Goddard Space Flight Center***********

Old model, new data: a match made in the heavens. A strangely powerful, long-lasting gamma-ray burst on Christmas Day, 2010 has finally been analyzed to the satisfaction of a multinational research team. Called the Christmas Burst, GRB 101225A was freakishly lengthy and it produced radiation at unusually varying wavelengths. But by matching the data with a model developed in 1998, the team was able to characterize the star explosion as a neutron star spiraling into the heart of its companion star.

The paper, titled "The unusual gamma-ray burst GRB 101225A from a helium star/neutron star merger at redshift 0.33," appeared in a recent issue of the journal Nature. Christina Thöne of Spain's Instituto de Astrofísica de Andalucía is the lead author, and Los Alamos computational scientist Chris Fryer is a contributor.

Fryer, with the Lab's Computer, Computational, and Statistical Sciences Division, realized that the peculiar evolution of the thermal emission (first showing X-rays with a characteristic radius of ~10^11 cm, followed by optical and infrared emission at ~10^14 cm) could be naturally explained by a model he and Stan Woosley of the University of California at Santa Cruz had developed in 1998. "The Helium Merger Model explained all the properties we were seeing," Fryer said, although he noted that proving this required a series of additional computational models by the international theory team studying the Christmas Burst, and the work is still under way.

Fryer is working with Wesley Even of the Los Alamos X Theoretical Design Division, using the U.S. Department of Energy's Advanced Simulation and Computing codes to study the emission of this burst in more detail.
For more information, see the LANL news release.

