This video shows how any Android developer working on a Windows, Mac, or Linux PC based on Intel® Architecture (IA) can dramatically speed up the Android emulator using Intel® Virtualization Technology. A head-to-head comparison demonstrates the performance gains from using the free Intel® HAXM driver with the Intel® x86 Atom™ system image, including faster boot sequences, gameplay, and application execution. Whether you are programming in Dalvik Java or C/C++ (for NDK applications), and whether you target tablets or phones based on Intel® Architecture or ARM, this solution delivers a better, faster emulation experience for testing and debugging Android applications.
The benefits of developing applications with the Intel® Hardware Accelerated Execution Manager
In my day, we carried >10% of our body weight, uphill, in the snow!
Remember tottering off to school with only your 10-inch digital tablet? Yeah, and nobody had acne, braces or glasses either. It may sound impossible, but coming to a backpack near you is not only a 10-inch Android solution, but a 7-inch solution based on the Intel® Atom architecture. Atom's high-performance CPU includes Intel® Burst Performance Technology, which delivers on-demand higher performance while optimizing power, and supports the multi-tasker in every kid with Intel® Hyper-Threading Technology. Check out the playground your software could be playing in, as described by Nick T at: http://www.phonearena.com/news/Intel-announces-two-Atom-based-Android-tablets-for-school-use_id46095. How can these functions and features differentiate your education software?

Meshcentral.com - Now with Intel AMT certificate activation
I just added certificate-based Intel AMT cloud activation support (TLS-PKI) to Meshcentral.com. It works behind NATs and HTTP proxies, uses a reusable USB key, and makes use of the Intel AMT one-time password (OTP) for improved security.
Ok, let's back up a little. Computers with Intel AMT need the feature activated before it can be used. Historically it's been difficult to set up the software, network, certificates, and settings to start activating Intel AMT in a way that lets administrators use all of its features, especially for smaller businesses. It's even more difficult if all the computers are mobile. With Mesh, we want to put all of the Intel AMT activation in the cloud, so administrators don't need to worry about how it all works. Administrators can launch their own instance of Mesh on Amazon AWS, install the mesh agent on each of their machines and, when time permits, create and use a single USB key to touch each machine for Intel AMT activation.
Meshcentral.com will automatically detect when a computer can be activated and do all of the appropriate work in the background, even behind an HTTP proxy or NAT/double-NAT routers. Mesh fully supports Intel AMT Client Initiated Remote Access (CIRA), so once activated, Intel AMT can call back to the Mesh server independent of OS state. Administrators can then use the web site or tools like Manageability Commander Mesh Edition to use Intel AMT features across network obstacles. Mesh will automatically route traffic using direct, relay, or CIRA connections, so administrators never need to worry about how to connect to a machine over the Internet. As an aside, Mesh fully supports Host Based Provisioning, so that is still an available option if you don't want to use a USB key and are ok with the client-mode limitations.
A full video demonstration is available here.
Enjoy!
Ylian
https://meshcentral.com
Struts2 Critical Vulnerability Fix (S2-016/S2-017)
A serious vulnerability was recently disclosed in Struts2. It affects every version from Struts 2.0 through 2.3 and can lead directly to remote control of the server and data leakage. The impact is significant, with e-commerce, banking, portal, and government sites among those most affected.
Official advisories:
S2-016: https://cwiki.apache.org/confluence/display/WW/S2-016
S2-017: https://cwiki.apache.org/confluence/display/WW/S2-017
Officially recommended fix: upgrade to the latest version, struts-2.3.15.1.
In practice, however, upgrading an existing system can introduce instability and incompatibilities with other frameworks such as Spring, at considerable cost.
For that reason I put together an approach that fixes both vulnerabilities completely without upgrading your existing Struts version.
It is shared below:
-------------------------
Step 1. Download http://jskfs.googlecode.com/files/struts2_(016_017)_bug_repair.rar.
Step 2. Unpack it and copy all files from its src directory into your project's src directory, then verify the project compiles
(this example uses Struts 2.0.9; adjust as needed for your actual Struts version).
The application server loads classes from the classes directory first, so they automatically override the classes inside the JAR.
Step 3. Register com.htht.commonweb.listener.MyServletContextListener in web.xml:
<listener>
<listener-class>org.hdht.commonweb.listener.MyServletContextListener</listener-class>
</listener>
Step 4. Restart the server; the fix is complete.
@Copyright. Please credit the source when reposting: http://blog.csdn.net/jzshmyt
Appendix: JavaEEbugRepair.java; for the complete package, see the unpacked struts2_(016_017)_bug_repair.rar.
-------------------------
import java.util.Map;
import ognl.MethodAccessor;
import ognl.MethodFailedException;
import ognl.OgnlRuntime;
/**
* @author yanjianzhong (yjz_ok@163.com) 2013/08/08
* Copyright. Please credit the source when reposting: http://blog.csdn.net/jzshmyt
* download: http://jskfs.googlecode.com/files/struts2_(016_017)_bug_repair.rar
*/
public class JavaEEbugRepair{
/*
* Official advisory:
* S2-016: https://cwiki.apache.org/confluence/display/WW/S2-016
* S2_016 bug repair
*/
private static S2_0XX s2_016 = new S2_0XX();
/*
* Modify ognl.Ognl#parseExpression to call the check below:
* public static Object parseExpression(String expression) throws OgnlException
* {
* //modify point begin
* if (JavaEEbugRepair.repair_s2_016(expression)) {
* return null;
* }
* //modify point end
* try {
* OgnlParser parser = new OgnlParser(new StringReader(expression));
* return parser.topLevelExpression();
* } catch (ParseException e) {
* throw new ExpressionSyntaxException(expression, e);
* } catch (TokenMgrError e) {
* throw new ExpressionSyntaxException(expression, e);
* }
* }
*/
public static boolean repair_s2_016(String expression){
return s2_016.check(expression);
}
/*
* Call from a listener in any framework: servlet, Struts, or Spring.
*/
public static void initRepair_S2_016(){
OgnlRuntime.setMethodAccessor(Runtime.class, new NoMethodAccessor());
OgnlRuntime.setMethodAccessor(System.class, new NoMethodAccessor());
OgnlRuntime.setMethodAccessor(ProcessBuilder.class,new NoMethodAccessor());
OgnlRuntime.setMethodAccessor(OgnlRuntime.class, new NoMethodAccessor());
s2_016 = new S2_0XX(){
public boolean check(String expression){
String evalMethod[] = {"Runtime", "ProcessBuilder","new File" };
String methodString = null;
methodString = expression.toLowerCase();
for (int i = 0; i < evalMethod.length; i++) {
if (methodString.indexOf(evalMethod[i].toLowerCase()) > -1) {
System.out.print("|OGNL is executing a malicious expression|" + methodString + "|If you see this message, contact your security engineer!");
return true;
}
}
return false;
}
};
}
/*
* S2-017:https://cwiki.apache.org/confluence/display/WW/S2-017
* S2_017 bug repair
*/
private static S2_0XX s2_017 = new S2_0XX();
/*
* Call by org.apache.struts2.dispatcher.mapper.DefaultActionMapper#handleSpecialParameters
* Repair Example :
* public void handleSpecialParameters(HttpServletRequest request, ActionMapping mapping)
* {
* Set uniqueParameters = new HashSet();
* Map parameterMap = request.getParameterMap();
* Iterator iterator = parameterMap.keySet().iterator();
* while (iterator.hasNext()) {
* String key = (String)iterator.next();
*
* if ((key.endsWith(".x")) || (key.endsWith(".y"))) {
* key = key.substring(0, key.length() - 2);
* }
* //modify point begin
* if (JavaEEbugRepair.repair_s2_017(key)) {
* return;
* }
* //modify point end
* if (!uniqueParameters.contains(key)) {
* ParameterAction parameterAction = (ParameterAction)this.prefixTrie.get(key);
*
* if (parameterAction != null) {
* parameterAction.execute(key, mapping);
* uniqueParameters.add(key);
* break;
* }
* }
* }
* }
*/
public static boolean repair_s2_017(String key){
return s2_017.check(key);
}
/*
* Call from a listener in any framework: servlet, Struts, or Spring.
*/
public static void initRepair_S2_017(){
s2_017 = new S2_0XX(){
public boolean check(String key){
return (key.contains("redirect:")) || (key.contains("redirectAction:")) || (key.contains("action:"));
}
};
}
}
/**
* Base class for the vulnerability checks.
* Notes:
* The fix is non-invasive by design.
* When no listener calls initRepair_S2_016 / initRepair_S2_017 to initialize the checks,
* the patched Ognl and DefaultActionMapper keep logic equivalent to the original, unpatched source.
*
*/
class S2_0XX {
public boolean check(String key){
return false;
}
}
class NoMethodAccessor implements MethodAccessor {
public NoMethodAccessor() {
}
@Override
public Object callStaticMethod(Map context, Class targetClass,
String methodName, Object[] args) throws MethodFailedException {
throw new MethodFailedException("do not run", methodName, null);
}
@Override
public Object callMethod(Map context, Object target, String methodName,
Object[] args) throws MethodFailedException {
throw new MethodFailedException("do not run", methodName,null);
}
}

Use HTML5 device orientation to run Windows 8 JavaScript apps on Android devices
For the last year there's been a good deal of information published on how to use the sensor APIs for Windows 8 devices. However, if you want portable HTML5 code, leveraging device sensors has typically required a native approach or a 3rd-party proprietary solution. With a bit of work I found you can leverage the new device orientation event listeners in JavaScript, which mostly duplicate the Windows 8 native device sensor APIs. And because HTML5 allows you to swap code on the fly, you can easily leverage both the native sensor APIs for Windows and the HTML5 device orientation APIs depending on the device running the code. In other words, the same code you have in Visual Studio can be hosted on the web and work for an Android phone or tablet. Cool stuff!
Benefit of Coding in JavaScript
JavaScript is fast becoming a highly portable language that can be used to call cross-platform web browser instructions or native APIs for a particular OS. Windows 8 allows you to compile a native app with JavaScript. A neat possibility of this is that the exact same code can be hosted and run by mobile devices or legacy PCs in a browser. A problem, however, becomes very apparent when you want to leverage device-specific APIs for sensors like the gyrometer, accelerometer, etc. While these Windows 8 classes are awesomely powerful to access via JavaScript, only Windows can execute them. Thankfully the HTML5 events are catching up and allow you to capture data from the device to get the device rotation information across all 3 axes.
With just a bit of work you can tweak the data to mimic the native sensor APIs and plug it into your core code, creating a seamless experience across devices and form factors. Note you may be able to do this using PhoneGap and other 3rd-party solutions; however, that is brokering the solution to another entity. That might be good, that might not; I'm certain that point is up for debate.
Device Orientation Browser Compatibility
This is a fairly new event listener, but it is pretty well adopted and can be used on Chrome, Firefox, and Opera and their mobile counterparts. Note that IE10 does not support it; however, as I show, you can swap out the Windows 8 sensor events for HTML5 sensor events and vice versa. Check out the graph of support from http://caniuse.com/deviceorientation.
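Given that support varies by browser, a runtime feature test is one way to decide which sensor path to take before registering listeners. The sketch below is my own illustration, not code from the article: the useHtml5Orientation flag is hypothetical (playing the role of the article's webapp variable), and the fallback branch is where the Windows 8 sensor code would go.

```javascript
// Sketch: pick the sensor path by feature detection rather than browser name.
// "useHtml5Orientation" is a hypothetical flag, analogous to the "webapp" variable used later.
var useHtml5Orientation = (typeof window !== "undefined") &&
    ("DeviceOrientationEvent" in window);

if (useHtml5Orientation) {
    // HTML5 path: alpha/beta/gamma arrive on the event object
    window.addEventListener("deviceorientation", function (event) {
        console.log("alpha:", event.alpha, "beta:", event.beta, "gamma:", event.gamma);
    }, false);
} else {
    // Fallback: on IE10 you would use the Windows.Devices.Sensors APIs instead
    console.log("deviceorientation not supported; use the native sensor APIs");
}
```

A check like this complements the user-agent sniffing shown later, since it reacts to what the browser actually supports rather than what it claims to be.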
Sample Code - Sensor Event Listener in Windows 8 vs HTML5
In the code below I'm showing how I get data from the Windows 8 Sensor API to move and rotate an object in my game. In my example app I use the variable "webapp" to determine which code to execute.
Note in this use case I'm reading sensors and assigning data to 3 variables: xAngle, yAngle and zAngle. xAngle is used to alter the X position of my object on the canvas. yAngle alters the Y position of an object on the canvas, and zAngle rotates my object, like a spinning top, either left or right.
if (webapp == false) { // use the Windows 8 Sensor API
    gyrometer = Windows.Devices.Sensors.Gyrometer.getDefault();
    gyrometer.addEventListener("readingchanged", onGyroReadingChanged);
    accelerometer = Windows.Devices.Sensors.Accelerometer.getDefault();
    accelerometer.addEventListener("readingchanged", onAccReadingChanged);
}
function onGyroReadingChanged(e) { // gets data for rotation around the Z axis and assigns it to zAngle
    var accelZ = e.reading.angularVelocityZ;
    zAngle = -accelZ.toFixed(2);
}
function onAccReadingChanged(e) { // gets the tilt information and assigns it to xAngle and yAngle
    var inclX = e.reading.accelerationY.toFixed(2) * -90;
    var inclY = e.reading.accelerationX.toFixed(2) * 90;
    xAngle = inclY;
    yAngle = holdAngle + inclX; // holdAngle is read or set in the calibration function
}
Here is the base HTML5 device orientation version of that code. Note this isn't done; read further to understand how I have to adjust it.
if (webapp == true) { // use the HTML5 device orientation event listener
    window.addEventListener('deviceorientation', capture_orientation, false);
}
function capture_orientation(event) { // read the web browser orientation sensors
    var alpha = event.alpha;
    var beta = event.beta;
    var gamma = event.gamma;
    alphaAngle = alpha.toFixed(2); // keep alpha to two decimal places
    xAngle = gamma;
    yAngle = holdAngle + beta;
    zAngle = -alphadelta * 5; // alphadelta still needs to be derived; see the next section
}
Solving issues between Windows 8 Sensor API and HTML5 device orientation
Defining a normal Z-axis orientation: One issue you may have is with the Z-axis rotation (gyrometer vs. alpha). Unless your use case is a compass, you will find that there is no "normal" orientation for the Z axis. For the X & Y axis rotation, by contrast, you can assume that the X & Y plane is at a normal or default position when parallel to the plane of the earth (flat on a table). Thus if your device is tilted on its side or pitched forward, your app might rotate something. However, a user can be holding the device anywhere across the Z axis and assume that experience should give them a normal or default experience. In other words, if you are facing north instead of east when you start your app, for most applications you assume it's going to be the same experience. Thus a key to having the Z-axis orientation work in your app is to turn that axis orientation into accelerated data. That way you know the difference, or speed, at which the device is moving around the Z axis. In other words, a still device that is not spinning is your "normal" or default orientation for the Z axis, and the data you want is how fast, and in what direction, you are rotating on the Z axis.
The device orientation event handler, however, does not provide that accelerated data directly. You will have to interpret the change in the data over time to get something like the accelerated Z-axis spin. Once you do this, the data is very comparable to the gyrometer data you get from the Windows 8 native sensor API. To do it I determine the difference between the previous, or last, alpha orientation and the current alpha orientation. That gives me a number that almost exactly matches the gyrometer acceleration data I get from my native code. Here's an example; this would replace the last line of our capture_orientation function:
if (alphainit < 1) { // we don't have a lastAlpha reading yet, so make it equal alpha the very first time
    lastAlpha = alphaAngle;
    alphainit = 1; // we now have the first alpha, so this code won't run again
}
alphadelta = alphaAngle - lastAlpha; // the delta between the current and last alpha
lastAlpha = alphaAngle; // store the current alpha for the next reading
zAngle = -alphadelta * 5; // this is the same as before
} // end of capture_orientation
Swapped X & Y axes: Another issue is that you'll find that for phones the X & Y axis (beta and gamma) data is swapped compared to tablet and PC devices. Perhaps the default or "normal" orientation on a phone is considered portrait, and that is why beta and gamma are reversed. For you it means that you will have to swap the gamma and beta data if you want the experience to be consistent in landscape mode across form factors.
To manage this situation I created a variable called "mobile", and when mobile == true we swap the beta and gamma data. The following code replaces the "var beta =" and "var gamma =" lines in our capture_orientation function.
if (mobile == true) { // swap beta and gamma for mobile browsers
    var beta = event.gamma * -1;
    var gamma = event.beta;
} else {
    var beta = event.beta;
    var gamma = event.gamma;
}
Managing browser nuances: As with any web application, you will have to get some information on the device and its browser and adjust some variables. The more devices you can test your app against, the more bulletproof you can make the experience. The good thing is you only need to edit a small set of code to manage which code should be turned on or off depending on the device and browser. Here is an example of my config.js, which does this. With it I determine information about the device and browser, then set variables to true or false to tailor the code to that device. For example, if the device is not a PC or tablet, I set the variable mobile to true. If it is not running MSIE (Microsoft Internet Explorer), then this is being run in a browser, and I set the webapp variable to true.
var str2 = navigator.platform;
var str3 = navigator.userAgent;
if (str3.indexOf("MSIE") >= 0) { // IE browser, so use the Windows 8 APIs
    var webapp = false;
    var tabletmode = false;
    var mobile = false;
} else { // run as a web app and use device orientation
    var webapp = true;
    var tabletmode = true;
}
if (str2.indexOf("Win") >= 0 || str3.indexOf("Opera") >= 0 || str2.indexOf("686") >= 0) { // on Windows or Opera we will not reverse X & Y
    var mobile = false;
} else { // this is likely a phone, and we need to reverse X & Y
    var mobile = true;
    var tabletmode = true;
}
Try it out. Check out my test game via my public Dropbox link: http://db.tt/4ch0jZJ4. If you have a new PC with orientation sensors, try this in Chrome, then also try it on an Android tablet or phone. Take note if running on Android: this is the exact same code I used to compile for Windows 8, running in your browser. If you have issues, let me know. The more browsers and devices that test it, the more I can optimize the code to accommodate them, which is a benefit of JavaScript and HTML5.
-- you can contact Bob Duffy on twitter @bobduffy

Intel Developer Forum 2013 - San Francisco!
Hi everyone! This year I have been asked to make a comeback at IDF, the Intel Developer Forum! As my blog readers know, I work on many interesting projects as a one-man development team: Meshcentral.com, Manageability Developer Tool Kit (DTK), Intel System Defense Utility (ISDU) and the Intel Developer Tools for UPnP Technologies. Many of these projects make direct use of unique Intel platform technologies like Intel Active Management Technology (Intel AMT), Intel Remote Wake, Intel Identity Protection Technology (Intel IPT), the Digital Random Number Generator, AES-NI, Wake-on-LAN, etc. So, I am in a pretty good position to share my experiences with developers and help more people use these great platform features.
This year, I am giving one session (1 hour) and one lab (2 hours). The lab is given twice, so the program will show two two-hour blocks. Here is my schedule as currently planned:
BCSS003:Meshcentral.com – Using Intel® AMT and Intel® Smart Connect Features From the Cloud.
Day 1, Tuesday September 10th, 3:45 to 4:45pm, Room 2007
SFTL003: Using Intel® AMT and Intel® Smart Connect Features From the Cloud
Day 2, Wednesday September 11th, 1:00 to 3:15pm, Room 2000
Day 2, Wednesday September 11th, 3:45 to 6:00pm, Room 2000
In both classes the goal is the same: show that Intel platforms are great at connecting to the cloud. Let's say you want to connect a device to a server in the cloud. We are going to look at how it's usually done, with a regular client-to-server connection. Then we are going to leverage all the Intel platform technologies at our disposal to add many more capabilities to our cloud service. The session will be a quick overview; I will demonstrate the benefits of using platform features and show you how to get started quickly and what code we already have available. The labs are typically smaller, more in-depth, and much more interactive; I get to answer questions that can help developers in their day-to-day work, show where to get source code, how to get started, and much more. The entire lab will be demos, code & fun!
I look forward to seeing you there. To register or for more information, links below:
IDF Main Page: http://www.intel.com/IDF
IDF Registration: https://secure.idfregistration.com/IDF2013/
Ylian
Meshcentral.com

Intel Presentations at CONSEGI 2013
Below we are making available the presentations given by Intel during CONSEGI 2013 in Brasília.
Cross-platform app development for mobile devices using HTML5
HTML5 has established itself as a programming standard with a high degree of portability and growing compatibility across browsers, operating systems, and types of computing devices. In this talk we present an overview of what HTML5 can offer app developers, along with tools, libraries, and examples of cross-platform HTML5 applications.
Interactive e-books based on open standards (ePub3 and HTML5)
Using the open standards ePUB version 3 and HTML5, it is possible to develop interactive e-books that can be read on many devices with a high degree of portability. Combining the two technologies makes it possible to build interactive books in an entirely new way, embedding HTML5 applications and content inside traditional e-books, with important advantages especially for educational titles. The talk presents both standards, demonstrates the process of creating an interactive book, and shows some examples already developed.
Get ready for Inside the Brackets - a new series of discussions about HTML5
The explosive growth of computing devices and gadgets means that no app development strategy is complete without considering a cross-platform approach, to cover the full spectrum of devices faster, more effectively, and at lower cost.
That is why IT managers and app developers have embraced open web technologies such as HTML5, CSS, and JavaScript as the third major ecosystem for app development. This is possible because HTML5 is supported by billions of devices, and all the major platforms ship browsers and runtimes with HTML5 support.
HTML5 is open, efficient, and powerfully flexible, but you need to know how to get the best out of it. That is why we created this new discussion series, Inside the Brackets.
Now you will have the opportunity to hear industry experts discuss and debate the opportunities, challenges, and best practices of cross-platform development with HTML5.
Register now to secure your seat at the table and get a view of HTML5 straight from inside the industry.
Our first episode - HTML5? Why I Oughta …
The first episode of the series airs live on August 27 at noon and will feature experts from Adobe, Intel, and Evans Data discussing the growth of HTML5 as an app development platform, why HTML5 matters to IT managers and developers, and why you should care too, all followed by a live Q&A session.
Upcoming episodes will cover topics such as HTML5 vs. native development, HTML5 tools and resources, and HTML5 in enterprise environments.
Links and Information
- Ready for Inside the Brackets? Register here to participate.
- For more information about HTML5, visit Intel's HTML5 community.


Introducing PROJECT ANARCHY™ - A Free Mobile Game Engine by HAVOK™
Project Anarchy is a free mobile game engine for iOS, Android (including x86), and Tizen. It includes Havok's Vision Engine along with Havok Physics, Havok Animation Studio and Havok AI. It has an extensible C++ architecture, optimized mobile rendering, a flexible asset management system, and Lua scripting and debugging. There are also complete game samples included with the SDK, along with extensive courseware on the Project Anarchy site that game developers can use to quickly get up to speed with the engine and bring their game ideas to life.
Ship for FREE on iOS, Android (including x86) & Tizen. Includes the Havok Vision Engine together with access to Havok's industry-leading suite of Physics, Animation and AI tools, as used in cutting-edge franchises such as The Elder Scrolls®, Halo®, Assassin's Creed®, Uncharted™ and Skylanders™.
- Extensible C++ plugin-based architecture
- Comprehensive game samples with full source art and source code
- Focus on community with forums for support, Q&A, feedback and hands-on training
- NO commercial restrictions on company size or revenue
- Upgrades for additional platforms and products, source and support available
- Includes FMOD, the industry’s leading audio tool
See the attached product document (Havok_Anarchy_2013.pdf).
Go to: Download Page
The Ridiculous Tablet vs PC Debate
[Opinion: The Ridiculous Tablet vs. PC Debate that wasn't]
Let me just get this out of the way so you know where I stand. Tablets are another PC form factor. It's just that simple. To claim otherwise comes off as trying to be sensationalistic to sell a story, naive, or possibly disingenuous. Sound too harsh? Well, allow me to explain, as I'd rather not be on the side that's confusing and obfuscating what's really going on inside the complexities of the PC market. Let's dissect what a PC is and why a tablet is one.
- PC = Personal Computer. It's "Personal" in the sense that you consume/produce digital activities on it to either access, store, or produce something that can be unique to you. The "Computer" part is that it's a piece of hardware digitally crunching software code via inputs and outputs. Doesn't matter if it's to a display, by a keyboard, a gesture, a mouse, or voice recognition. It's obviously more complex than this but you get the general idea.
- OS = Operating System. The OS is responsible for bridging the gap and communicating between the hardware capabilities and what the software is telling it to do (e.g. Windows, MacOS, Linux (Ubuntu, Red Hat), Android, etc.). This is somewhat chicken-and-egg with the form the device takes, but I list the OS first because without it the hardware is pretty much a brick or boat anchor.
- Next we have what I call the FF = Form Factor. PCs ~40-50 years ago used to look drastically different. They used to look more like server farms than today's tablets, Ultrabooks, iMacs, etc. The point is simply this: a PC can come in almost any conceivable shape and size you can imagine. Obviously the shapes and sizes we see today make the most sense given our lifestyles, the way we work and play, etc. Here are a few things we can bank on in the future.
- ~40-50 years from now, PCs are likely going to look drastically different than they do today. If the past is any indication of the future, then the following assumptions can be made: more powerful, longer battery life, mostly mobile, bigger storage, thinner, lighter, and smaller.
- Convergence. Not everything will converge; but let's face it, when you look at what's happened with point-and-shoot digital cameras, GPS devices, digital music players, 'dumb' phones, and so forth, there's a strong case for digital devices converging more, not less. Most of these devices are getting both 1) smarter, and 2) connected.
- Commoditization. Remember the prices of PCs from say 30, 20, or even 10 years ago? Well... for the most part they're getting cheaper.
- Apps = Software Applications. There's a limitless volume of apps out there as well. This can be anything, ranging from surfing the Internet via Firefox, Google, or IE, to emailing/texting/skyping friends, to playing games, to working, watching a movie/TV, or listening to music.
In all of these key cases the OS, the form factor, and the apps continue to evolve whenever advancements are made in the coding languages, the components that make up the various form factors, or the software apps that we interact with.
Tablets still allow us as users to run most of the software applications we've all come to know and love. Can you still surf the internet? Play a game? Send an email? Chances are yes. More robust and capable tablets such as the MS Surface Pro allow you to do anything you normally would be able to do at work or for leisure. Lastly, when you crack one open you still see these newer devices being powered by some 'x' processor (e.g. ARM, AMD, Intel, etc.); there's still a motherboard, memory, and typically a drive (solid-state or otherwise). All you're seeing is just another evolutionary branch on the tree of PC.
Here's one picture of how I like to illustrate it.
So there you have it. I think the next steps we'll see in the evolution of PCs will be credit-card-sized PCs, perhaps some wearables, and much smarter PC devices that we can interact with more. The future is exciting indeed, and the PC, in all its myriad and evolving forms, is bound to be with us for a very long time.
I'll summarize it like this to the press, analysts, researchers, etc.: please stop confusing the form with the function. The only thing dying right now isn't the PC but rather the single-purpose and 'dumb' devices.
I hope you enjoyed this piece. If you disagree or agree I'd love to hear your thoughts.
Best regards!
Matt

Beacon Mountain v0.5 for Android*

Intel development environment for native Android* applications on devices based on Intel® Atom™ and ARM* processors
- Supports Jelly Bean and later versions.
- Runs on Apple OS X* and 64-bit Microsoft Windows* 7 and 8 host environments.
- Provides Eclipse* plugins and supports the Android SDK, NDK, and more.
Intel® HTML5 Tools for developing mobile applications
by Egor Churaev
Downloads
Intel® HTML5 Tools for developing mobile applications [PDF 821.98KB]
iOS Source Code [ZIP file 168 KB]
HTML5 Tools Result Source Code [ZIP file 86 KB]
HTML5 is the new HTML standard. Recently, Intel Corporation announced a set of HTML5 Tools for developing mobile applications. This paper shows you how to port an Apple iOS* accelerometer app to HTML5 using these tools. Please note: Auto-generated code created by the XDK may contain code licensed under one or more of the licenses detailed in Appendix A of this document. Please refer to the XDK output for details on which libraries are used to enable your application.
Intel® HTML5 App Porter Tool
The first thing we’ll do is take an iOS accelerometer app and convert the Objective-C* source code to HTML5. We’ll do this using the Intel® HTML5 App Porter Tool and the source code found here: [iOS_source.zip] (Note: the iOS_source sample code is provided under the Intel Sample Software License detailed in Appendix B). You can download the Intel HTML5 App Porter Tool from the Tools tab here: http://software.intel.com/en-us/html5. After filling in and submitting the form with your e-mail address, you will get links for downloading this tool. The instructions for how to use this tool can be found here: http://software.intel.com/en-us/articles/tutorial-creating-an-html5-app-from-a-native-ios-project-with-intel-html5-app-porter-tool.
When you are finished performing all the steps, you will get HTML5 source code.
Intel® XDK
You can open the HTML5 code in any IDE. Intel offers a convenient tool for developing HTML5 applications: the Intel® XDK cross-platform development kit (http://html5dev-software.intel.com/). With the Intel XDK, developers can write a single source code base for deployment on many devices. What is particularly good is that it is not necessary to install it on your computer: you can install it as an extension for Google Chrome*. If you use another browser, you have to download a JavaScript* file and run it. Sometimes it’s necessary to update Java*.
After installing Intel XDK, you will see the main window:

If you want to port existing code, press the big “Start new” button.
If you’re creating a new project, enter the Project Name and check “Create your own from scratch,” as shown in the screen shot below.

Check “Use a blank project.” Wait a bit, and you will see the message “Application Created Successfully!”
Click “Open project folder.”

Remove all files from this folder and copy the ported files. We haven’t quite ported the accelerometer app yet. We still have to write an interface for it. It is possible to remove the hooks created by the Intel HTML5 App Porter tool. Remove these files:
- todo_api_application__uiaccelerometerdelegate.js
- todo_api_application_uiacceleration.js
- todo_api_application_uiaccelerometer.js
- todo_api_js_c_global.js
To update the project in Intel XDK, go to the editor window in the Windows emulator.
Open the index.html file and remove the lines left from the included files.

Open the todo_api_application_appdelegate.js file and implement the unmapped “window” property of the delegate.
application.AppDelegate.prototype.setWindow = function(arg1) {
    // ================================================================
    // REFERENCES TO THIS FUNCTION:
    //   line(17): C:\Work\Blogging\echuraev\Accelerometer\Accelerometer\AppDelegate.m
    //   In scope: AppDelegate.application_didFinishLaunchingWithOptions
    //   Actual arguments types: [*js.APT.View]
    //   Expected return type: [unknown type]
    //
    //if (APT.Global.THROW_IF_NOT_IMPLEMENTED)
    //{
    //    // TODO remove exception handling when implementing this method
    //    throw "Not implemented function: application.AppDelegate.setWindow";
    //}
    this._window = arg1;
};

application.AppDelegate.prototype.window = function() {
    // ================================================================
    // REFERENCES TO THIS FUNCTION:
    //   line(20): C:\Work\Blogging\echuraev\Accelerometer\Accelerometer\AppDelegate.m
    //   In scope: AppDelegate.application_didFinishLaunchingWithOptions
    //   Actual arguments types: none
    //   Expected return type: [unknown type]
    //
    //   line(21): C:\Work\Blogging\echuraev\Accelerometer\Accelerometer\AppDelegate.m
    //   In scope: AppDelegate.application_didFinishLaunchingWithOptions
    //   Actual arguments types: none
    //   Expected return type: [unknown type]
    //
    //if (APT.Global.THROW_IF_NOT_IMPLEMENTED)
    //{
    //    // TODO remove exception handling when implementing this method
    //    throw "Not implemented function: application.AppDelegate.window";
    //}
    return this._window;
};
Open the viewcontroller.js file. Remove all the functions used for working with the accelerometer in the old iOS app. In the end we get this file:
APT.createNamespace("application");

document.addEventListener("appMobi.device.ready", onDeviceReady, false);

APT.ViewController = Class.$define("APT.ViewController");

application.ViewController = Class.$define("application.ViewController", APT.ViewController, {
    __init__: function() {
        this.$super();
    }
});

In the ViewController_View_774585933.css file, we have to change the color styles of elements from black to white to be readable on the black background: color: rgba(0,0,0,1); → color: rgba(256,256,256,1);. As a result we get:
div#Label_590244915 { left: 20px; color: rgba(256,256,256,1); height: 21px; position: absolute; text-align: left; width: 320px; top: 0px; opacity: 1; }
div#Label_781338720 { left: 20px; color: rgba(256,256,256,1); height: 21px; position: absolute; text-align: left; width: 42px; top: 29px; opacity: 1; }
div#Label_463949782 { left: 20px; color: rgba(256,256,256,1); height: 21px; position: absolute; text-align: left; width: 42px; top: 51px; opacity: 1; }
div#Label_817497855 { left: 20px; color: rgba(256,256,256,1); height: 21px; position: absolute; text-align: left; width: 42px; top: 74px; opacity: 1; }
div#Label_705687206 { left: 70px; color: rgba(256,256,256,1); height: 21px; position: absolute; text-align: left; width: 42px; top: 29px; opacity: 1; }
div#Label_782673145 { left: 70px; color: rgba(256,256,256,1); height: 21px; position: absolute; text-align: left; width: 42px; top: 51px; opacity: 1; }
div#Label_1067317462 { left: 70px; color: rgba(256,256,256,1); height: 21px; position: absolute; text-align: left; width: 42px; top: 74px; opacity: 1; }
div#View_774585933 { left: 0px; height: 548px; position: absolute; width: 320px; top: 20px; opacity: 1; }
After updating the emulator window, you see:

To code the accelerometer functions, we need to use the appMobi JavaScript Library. Documentation for this library can be found here. It’s installed when you download Intel XDK.
Open the index.html file and add this line into the list of scripts:
<script type="text/javascript" charset="utf-8" src="http://localhost:58888/_appMobi/appmobi.js"></script>
Open the ViewController_View_774585933.html file. We have to rename fields to more logical names from:
<div data-apt-class="Label" id="Label_705687206">0</div>
<div data-apt-class="Label" id="Label_782673145">0</div>
<div data-apt-class="Label" id="Label_1067317462">0</div>
to:
<div data-apt-class="Label" id="accel_x">0</div>
<div data-apt-class="Label" id="accel_y">0</div>
<div data-apt-class="Label" id="accel_z">0</div>
The same should be done in the ViewController_View_774585933.css file, where we have to rename the style names.
Open the viewcontroller.js file and write some functions for using the accelerometer.
function suc(a) {
    document.getElementById('accel_x').innerHTML = a.x;
    document.getElementById('accel_y').innerHTML = a.y;
    document.getElementById('accel_z').innerHTML = a.z;
}

var watchAccel = function () {
    var opt = {};
    opt.frequency = 5;
    timer = AppMobi.accelerometer.watchAcceleration(suc, opt);
}

function onDeviceReady() {
    watchAccel();
}
document.addEventListener("appMobi.device.ready", onDeviceReady, false);
Update the project, and you can see it on the emulator window:
You can see how the accelerometer works on Intel XDK using the “ACCELEROMETER” panel:
The application will look like this:
The complete application source code can be found here.
Appendix A: Intel® XDK Code License Agreements
Appendix B: Intel Sample Source Code License Agreement
Intel and the Intel logo are trademarks of Intel Corporation in the U.S. and/or other countries.
Copyright © 2013 Intel Corporation. All rights reserved.
*Other names and brands may be claimed as the property of others.
HTML5 Canvas - Tap & Rotate Player with Arctangent
One of the more challenging user experiences in a game is the need to move AND aim a player on the screen. That gets harder on mobile devices, where you have limited controller options. One way to fix this is to let the user tap where your character should aim, then have it turn in that direction. Think of a gun turret or a spaceship where all you do is tap on enemies and the turret or spaceship turns and fires in that direction.
To do this you simply need to know what direction your object or "Actor" is currently pointing, and the location of the object you want to point at. With a single line of trigonometry it's a simple thing to code.
Here's a quick Vine video of this approach (roll over it with your mouse to view)
Below are the steps to do this with code
Step 1. Figure A. Canvas rotates in radians. So rather than thinking of 360 degrees in a circle, think of radians as an expression of PI: halfway around a circle is 3.14 radians (PI), and all the way around is 6.28 radians, or 2*PI. So at any given point you should know your Actor's radian angle in Canvas. In my code I simply increment (spin) the canvas by +1 or -1 degree each frame, so it is easier for my "spin" variable to work in 360 degrees for a smooth animation. I therefore convert from degrees to radians and back in my code.
shipRadian = (spin * Math.PI/180); // this will turn 0-360 degrees into radians from 0-6.28
context.rotate(shipRadian);        // this will rotate the canvas to a radian
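As a sanity check on the degree↔radian round trip described above, here is a minimal sketch (the helper names are mine, not from the article):

```javascript
// Hypothetical helpers illustrating the degree/radian conversions used above.
function degToRad(deg) {
    return deg * Math.PI / 180;
}

function radToDeg(rad) {
    return rad * 180 / Math.PI;
}

console.log(degToRad(180));          // ~3.14159 (PI, halfway around)
console.log(degToRad(360));          // ~6.28319 (2*PI, all the way around)
console.log(radToDeg(Math.PI / 2));  // ~90
```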
Step 2. Figure B. To point your Actor at an enemy or a touch event on the screen, you need to calculate the radian angle of that enemy relative to the Actor. You can use a simple JavaScript math expression to do this: Math.atan2(DeltaY, DeltaX). In my example, the touch event places a crosshair symbol on the screen, and we then fire at that crosshair. The code to calculate the radian we need looks like this:
xhairRadian = Math.atan2(touchY - shipy, touchX - shipx);
Step 3. Figures C & D. The next thing to do is to subtract your Actor's radian from the new radian value. That delta will be a number either smaller or larger than PI (3.14). For the best animation we want the Actor in our scene to turn along the shortest path. Generally, if the delta radian is smaller than PI, the rotation toward the new radian value will be clockwise; if larger than PI, the rotation toward the new radian value will need to be counterclockwise.
Code for Figure C & Figure D
if (xhairRadian <= 0) {
    // The arctangent math calculates a negative radian for half of the radians.
    // This turns the negative radian into its positive counterpart.
    xhairRadian = 2 * Math.PI + xhairRadian;
}
deltaRadian = xhairRadian - shipRadian; // Determine the delta between the ship and the new radian
if (deltaRadian < -Math.PI || deltaRadian > Math.PI) {
    // The delta is beyond PI or -PI, so the shorter path wraps the other way around
    if (xhairRadian < shipRadian) { direction = "right"; }
    if (xhairRadian > shipRadian) { direction = "left"; }
} else {
    // If the difference in angle is positive, spin toward the right
    if (xhairRadian > shipRadian) { direction = "right"; }
    // If the difference in angle is negative, spin toward the left
    if (xhairRadian < shipRadian) { direction = "left"; }
}
shotstart = 1; // shotstart = 1 means we've finished the calculations and are ready to spin and shoot
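Condensed, the same shortest-path decision can be sketched as a standalone function (a hypothetical name of mine; it assumes both angles are already normalized to the 0–2π range):

```javascript
// Hypothetical sketch: pick the shorter turn direction between two angles
// (in radians, both assumed normalized to [0, 2*PI)).
// "right" is the clockwise/positive spin direction, as in the article.
function turnDirection(shipRadian, targetRadian) {
    var delta = targetRadian - shipRadian;
    if (delta < -Math.PI || delta > Math.PI) {
        // The short way wraps past 0/2*PI, so turn opposite the raw sign of delta.
        return (targetRadian < shipRadian) ? "right" : "left";
    }
    // Otherwise turn in the direction of the raw difference.
    return (targetRadian > shipRadian) ? "right" : "left";
}

console.log(turnDirection(0.5, 1.0)); // "right" (small positive delta)
console.log(turnDirection(0.5, 6.0)); // "left"  (shorter to wrap backwards through 0)
```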
Step 4. Figure E. The next thing to do is to start incrementing the canvas rotation in the proper direction. Through some testing I found a static rate of movement creates a problem: either the ship takes too long to come around and the action isn't good, or the ship moves too quickly over short distances and it looks choppy. To fix this, I add acceleration to the rotation: each frame I increase the speed until it hits a max speed. That creates fast and smooth action.
var speedmax = 20; // our top rate of speed

if (shotstart == 1) { // if the shot was made, start to spin the ship
    if (direction == "left") {
        spinspeed--; // if not at top speed, increase the speed of the ship turning in the negative direction
        if (spinspeed < (speedmax * -1)) {
            spinspeed = (speedmax * -1); // if you hit top speed, don't increase the speed anymore
        }
    } else {
        spinspeed++; // if not at top speed, increase the speed of the ship turning in the positive direction
        if (spinspeed > speedmax) { // if you hit top speed, don't increase the speed anymore
            spinspeed = speedmax;
        }
    }
    spin += spinspeed;  // our spin number increases by the rate of spin
    spinspeed *= 1.6;   // increase the spin rate by 60% each frame
}
Step 5. Figure F. Because our radians and degrees go from 0 to 6.28 and 0 to 360, when you rotate counterclockwise past zero you need to adjust the math. Since our "spin" variable is in degrees, when we pass zero we need to add 360 rather than going negative. Likewise, going clockwise past 360 you need to wrap back to zero rather than counting up past 360. To manage this you'll need a piece of code that will either subtract or add 360 to the current spin, depending on the direction.
if (spin >= 360) { // if you've come all the way around, reset the spin by 360
    spin = spin - 360;
}
if (spin <= 0) {   // if you've come all the way around, reset the spin by 360
    spin = spin + 360;
}
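The wrap-around above is just modular arithmetic; a hypothetical one-line helper (not from the article) expresses the same thing:

```javascript
// Hypothetical helper: wrap any angle in degrees into the [0, 360) range.
function normalizeDegrees(deg) {
    return ((deg % 360) + 360) % 360;
}

console.log(normalizeDegrees(365)); // 5
console.log(normalizeDegrees(-10)); // 350
console.log(normalizeDegrees(720)); // 0
```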
Step 6. Figure G. Ultimately, when the Actor's radian matches the new radian, we want to stop spinning. However, with fractional numbers it is hard to get one value to exactly equal another. To make this easier, we add a buffer amount around our target radian value: if our Actor is close enough to pointing in the right direction, we go ahead and make its angle equal the target radian.
if (spinRound >= xhairRadianround - 0.5 && spinRound <= xhairRadianround + 0.5 || spinRound > Math.PI*2 || spinRound < 0) {
    // if the ship is close enough to the proper angle, no need to animate; just point the ship at the cursor
    shipRadian = xhairRadian;
    spinspeed = spindefault;
    shotstart = 0;
} else {
    // if the angle is far enough off, keep spinning the ship
    shipRadian = (spin * Math.PI/180);
}
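The ±0.5 window in that check is just an approximate-equality test; a minimal sketch (hypothetical helper name, using the article's 0.5-radian tolerance):

```javascript
// Hypothetical helper: true when two angles (in radians) are within a tolerance.
function closeEnough(a, b, tolerance) {
    return Math.abs(a - b) <= tolerance;
}

console.log(closeEnough(3.0, 3.3, 0.5)); // true  -> snap the ship to the target angle
console.log(closeEnough(0.5, 2.0, 0.5)); // false -> keep spinning
```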
Step 7. Figure H. With that, we've completed the task of rotating our Actor in the correct direction, and we can trigger the event, which in our case is firing a laser.
if (shipRadian == xhairRadian) { // we are pointed at the place we tapped, now fire the lasers
    drawShot();
    shotprogress = true; // flag to say we completed drawing our lasers
}
That's all there is to it. Check it out yourself on my public drop box: Launch Example. Open it with any HTML5-compatible device and tap around. View and copy the source to play with your own version.
Follow Bob Duffy on Twitter @bobduffy

Third-Party Android Application Debug Reference on Intel® Processor-based Platforms Part 1
Contents
Third-Party Application Debug Reference on Intel® Processor-based Platforms
Introduction
Debug Tricks
Miscellaneous Debugging Tricks
MethodTracing
HProf (Heap Profile)
SamplingProfile
System Signal
Logcat
jdwp (Java debug wire protocol)
android.os.Debug
Target Device Side:
Host PC Side:
gdb multi-thread debug command
Debug Case: Debug service in system_server process
Debug Case: Debug Android app with native library
Android Core dump file Analysis
Troubleshooting in Eclipse*
How to use kprobe kernel debug
Intel GPA
Systrace
Matrix
Wuwatch
SEP (Sampling Enabling Product)
Kratos
Introduction
Developing applications is important for Intel® processor-based mobile platforms to be successful. Platform engineers and application engineers who want to enable as many applications as possible on Intel platforms have no source code for applications from third-party ISVs (e.g., Google), so a big question is how to debug these closed-source applications on Intel platforms.
This document presents the debugging experience, detailed methodology, and tool usage for debugging closed-source third-party applications on Intel processor-based platforms.
Debug Tricks
Call Stack
Description:
The call stack is important for debugging because it tells you where the bug occurs in the source code. It’s a running history, if you will. There are call stacks for Java* space and native space, and different ways to print them, as the following paragraphs show.
Print Java Space Call Stack:
Method that will not break the program which you are debugging.
import android.util.Log;

void printJavaCallStack() {
    java.util.Map<Thread, StackTraceElement[]> ts = Thread.getAllStackTraces();
    StackTraceElement[] ste = ts.get(Thread.currentThread());
    for (StackTraceElement s : ste) {
        Log.d("zwang", s.toString());
    }
}
Method that will break the program, so do not use it.
new RuntimeException("stack").printStackTrace();
Print Native Space Call Stack:
Method that will not break the program which you are debugging.
#include <utils/CallStack.h>

using namespace android;

namespace android {
    void get_backtrace() {
        CallStack stack;
        stack.update();
        stack.dump("");
    }
};
Method that will break the program, so do not use it unless necessary.
int* p = NULL;
*p = 0x8888;
Print Stack from Native Space to Java Space
Apply patch 0001-Dalvik-add-support-of-print-Java-Stack-from-Native-s.patch into Dalvik project.
Make Dalvik project and push libdvm.so into /system/lib on the device.
After reboot, you can use Dalvik’s interface in two ways to dump the process’ stack from native space to Java space into the /sdcard/logs/javastack file:
By shell command:
kill -31 <pid>
By API Interface:
Add the statement kill(getpid(),31); at the point in the source code where you want to dump the stack from native space to Java space.
For example:
<JB>/frameworks/native/libs/binder/IServiceManager.cpp

virtual sp<IBinder> getService(const String16& name) const
{
    kill(getpid(), 31);
    …
}
Check the Java stack in /sdcard/logs/javastack on the device. You can see the whole call stack from native space to Java space, so you will know which Java functions and native libraries are called.
root@android:/sdcard/logs # cat javastack
----- pid 25653 at 1982-01-01 02:15:14 -----
Cmd line: com.android.providers.calendar

DALVIK THREADS:
(mutexes: tll=0 tsl=0 tscl=0 ghl=0)
"main" prio=5 tid=1 NATIVE
  | group="main" sCount=0 dsCount=0 obj=0x417c2550 self=0x417b2af0
  | sysTid=25653 nice=0 sched=0/0 cgrp=apps handle=1074057536
  | schedstat=( 13633356 12645753 23 ) utm=0 stm=1 core=1
  #00  pc 000b01ad  /system/lib/libdvm.so
  #01  pc 000907ee  /system/lib/libdvm.so
  #02  pc 00091ad4  /system/lib/libdvm.so
  #03  pc 0008a33d  /system/lib/libdvm.so
  #04  pc 00000400  [vdso]
  at android.view.Display.init(Native Method)
  at android.view.Display.<init>(Display.java:57)
  at android.view.WindowManagerImpl.getDefaultDisplay(WindowManagerImpl.java:630)
  at android.app.ActivityThread.getDisplayMetricsLocked(ActivityThread.java:1530)
  at android.app.ActivityThread.applyConfigurationToResourcesLocked(ActivityThread.java:3649)
  at android.app.ActivityThread.handleBindApplication(ActivityThread.java:3969)
  at android.app.ActivityThread.access$1300(ActivityThread.java:130)
  at android.app.ActivityThread$H.handleMessage(ActivityThread.java:1255)
  at android.os.Handler.dispatchMessage(Handler.java:99)
  at android.os.Looper.loop(Looper.java:137)
  at android.app.ActivityThread.main(ActivityThread.java:4745)
  at java.lang.reflect.Method.invokeNative(Native Method)
  at java.lang.reflect.Method.invoke(Method.java:511)
  at com.android.internal.os.ZygoteInit$MethodAndArgsCaller.run(ZygoteInit.java:786)
  at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:553)
  at dalvik.system.NativeStart.main(Native Method)
The patch 0001-systemcore-add-Dalvik-Tombstone-call-stack-support.patch for the system/core project is optional; it just adds a tombstone that prints the Java stack into /sdcard/logs/javastack.
0001-Dalvik-add-support-of-print-Java-Stack-from-Native-s.patch 0001-systemcore-add-Dalvik-Tombstone-call-stack-support.patch
Log Tools
logcat - Android* log message application
Process interface:
cat /proc/kmsg – kernel debug message
/proc/cpuinfo
/proc/meminfo
/proc/iomem
/proc/vmallocinfo
/proc/interrupts
/proc/sys/vm/drop_caches
Information dump:
procrank – process memory rank
procmem – a specific process’ memory
showslab – kernel slab utilization; /proc/slabinfo
latencytop – CONFIG_LATENCYTOP
showmap – process memory mmap address space; /proc/XXX/maps
dumpstate – system information like memory, CPU, etc.
dumpsys – system service information, etc.

To see all of the "subcommands" of dumpsys, do: dumpsys | grep DUMP

DUMP OF SERVICE SurfaceFlinger:
DUMP OF SERVICE accessibility:
DUMP OF SERVICE account:
DUMP OF SERVICE activity:
DUMP OF SERVICE alarm:
DUMP OF SERVICE appwidget:
DUMP OF SERVICE audio:
DUMP OF SERVICE backup:
DUMP OF SERVICE battery:
DUMP OF SERVICE batteryinfo:
DUMP OF SERVICE clipboard:
DUMP OF SERVICE connectivity:
DUMP OF SERVICE content:
DUMP OF SERVICE cpuinfo:
DUMP OF SERVICE device_policy:
DUMP OF SERVICE devicestoragemonitor:
DUMP OF SERVICE diskstats:
DUMP OF SERVICE dropbox:
DUMP OF SERVICE entropy:
DUMP OF SERVICE hardware:
DUMP OF SERVICE input_method:
DUMP OF SERVICE iphonesubinfo:
DUMP OF SERVICE isms:
DUMP OF SERVICE location:
DUMP OF SERVICE media.audio_flinger:
DUMP OF SERVICE media.audio_policy:
DUMP OF SERVICE media.player:
DUMP OF SERVICE meminfo:
DUMP OF SERVICE mount:
DUMP OF SERVICE netstat:
DUMP OF SERVICE network_management:
DUMP OF SERVICE notification:
DUMP OF SERVICE package:
Permission [android.permission.DUMP] (49f43060): perm=Permission{49fc39e0 android.permission.DUMP} android.permission.DUMP
DUMP OF SERVICE permission:
DUMP OF SERVICE phone:
DUMP OF SERVICE power:
DUMP OF SERVICE reboot:
DUMP OF SERVICE screenshot:
DUMP OF SERVICE search:
DUMP OF SERVICE sensor:
DUMP OF SERVICE simphonebook:
DUMP OF SERVICE statusbar:
DUMP OF SERVICE telephony.registry:
DUMP OF SERVICE throttle:
DUMP OF SERVICE usagestats:
DUMP OF SERVICE vibrator:
DUMP OF SERVICE wallpaper:
DUMP OF SERVICE wifi:
DUMP OF SERVICE window:

dumptcp – tcp/ip information
bugreport
Wakelock
Description:
A locked wakelock, depending on its type, prevents the system from entering suspend or other low-power states. When creating a wakelock, you can select its type. If the type is set to WAKE_LOCK_SUSPEND, the wakelock prevents a full system suspend. If the type is WAKE_LOCK_IDLE, low-power states that cause large interrupt latencies, or that disable a set of interrupts, will not be entered from idle until the wakelock is released. Unless the type is specified, this document refers to wakelocks with the type set to WAKE_LOCK_SUSPEND.

If the suspend operation has already started when a wakelock is locked, it will abort the suspend operation as long as it has not already reached the suspend_late stage. This means that locking a wakelock from an interrupt handler or a freezeable thread always works, but if you lock a wakelock from a suspend_late handler, you must also return an error from that handler to abort suspend.
Debug Method:
To check the wakelock status, use cat /proc/wakelocks
name – the component that holds the wakelock
wake_count – the number of times the wakelock has been held
active_since – the time interval since the wakelock was last acquired

Tools:
CPUSpy.apk – Use this application to get the device’s deep sleep time and to find out whether the device has a problem going into deep sleep.
get_activewakelock.sh – Use this script to get the name and active_since columns from /proc/wakelocks
Both CPUSpy.apk and get_activewakelock.sh are attached: get_activewakelock.sh, CPUSpy.apk
Miscellaneous Debugging Tricks

MethodTracing
Use MethodTracing to find hot spots and analyze performance. You can also check CPU usage, function call times, etc.
Follow these steps to do a trace:
import android.os.Debug;
……
android.os.Debug.startMethodTracing("/data/tmp/test"); // create /data/tmp first
……
// the program to be traced here
android.os.Debug.stopMethodTracing();
After running, there will be a trace file in /data/tmp/test.trace.
Copy the trace file to the host PC:
$ adb pull /data/tmp/test.trace ./
Use the trace tools in the Android SDK for trace analysis:
$ $ANDROID_SRC/out/host/linux-x86/bin/traceview test.trace
$ $ANDROID_SRC/out/host/linux-x86/bin/dmtracedump -g test.png test.trace
Note:
There is a conflict between trace creation and the DEBUG version libdvm. Use the trace method only for the non-DEBUG version build.
HProf (Heap Profile)
Use HProf to analyze Java memory and show Dalvik memory usage, memory leaks, etc.
Follow these steps to do a heap profile:
import android.os.Debug;
import java.io.IOException;
……
try {
    android.os.Debug.dumpHprofData("/data/tmp/input.hprof"); // create /data/tmp
} catch (IOException ioe) {
}
Copy the hprof file to the host PC:
$ adb pull /data/tmp/input.hprof ./
Use hprof-conv to convert the hprof file into the format used by the MAT tool:
$ $ANDROID_SRC/out/host/linux-x86/bin/hprof-conv input.hprof output.hprof
Use MAT to open the hprof file and check the result.
MAT link: http://www.eclipse.org/mat/downloads.php
Note:
The tool only shows Java space memory usage, not native space.
SamplingProfile
SamplingProfile samples routines at millisecond intervals, then outputs a sample log.
Follow these steps to do a sampling profile:
import dalvik.system.SamplingProfiler;
……
SamplingProfiler sp = SamplingProfiler.getInstance();
sp.start(n); // n is the number of samples
sp.logSnapshot(sp.snapshot());
……
sp.shutDown(); // a sample thread will output information in logcat
System Signal
Use this tool to send the system signals SIGQUIT and SIGUSR1 to Dalvik, which handles these signals (dalvik/vm/SignalCatcher.c) to print the call stack or memory usage.
Follow these steps to send a system signal and get the call stack:
$ chmod 777 /data/anr -R
$ rm /data/anr/traces.txt
$ ps                        # find pid
$ kill -3 pid               # send SIGQUIT to the process to get a trace file
$ cat /data/anr/traces.txt
$ chmod 777 /data/misc -R
$ ps                        # find pid
$ kill -10 pid              # send SIGUSR1 to the process to get an hprof file
$ ls /data/misc/*.hprof
Logcat
Use this tool to get the application log (aplog) output from the Android system.
You can use the following methods to add to or read the log.
android.util.Log uses println for Java output with the I/V/D… priorities.
Dalvik uses a pipe and a thread: it uses dup2 to redirect stdout and stderr to a pipe (vm/StdioConverter.c: dvmStdioConverterStartup), starts a thread to read the pipe (dalvik/vm/StdioConverter.c: stdioconverterThreadStart()), then uses the LOG tool (system/core/liblog/logd_write.c: __android_log_print()) to output the log into /dev/log/*.
The parameters for the logcat tool are:
# logcat -b main   // show main buffer
# logcat -b radio  // show radio buffer
# logcat -b events // show event buffer
jdwp (Java debug wire protocol)
The Java Debug Wire Protocol (JDWP) is the protocol used for communication between a debugger and the Java virtual machine (VM) it debugs. In the Android system, JDWP is the protocol used between adb and a Java application on the device. Developers can use it for many debug purposes.
Go to this link for more information:
http://docs.oracle.com/javase/1.5.0/docs/guide/jpda/jdwp-spec.html
android.os.Debug
Android’s debug tool, android.os.Debug, has many debug APIs. More information can be found at this link:
http://developer.android.com/reference/android/os/Debug.html
Get nano level time
threadCpuTimeNanos()
Get memory allocation
startAllocCounting() stopAllocCounting() getGlobalAllocCount() get…..
Print the classes loaded in current process
getLoadedClassCount() printLoadedClasses() //it needs to open NDEBUG function
Debug Tools
Powerful debug tools help developers root-cause issues quickly and easily. This chapter introduces typical Android debug tools and techniques for using them to root-cause issues.
GDB
Printing logs is one way to debug Android apps, but it is inefficient and difficult to use.
Gdb is a good tool for debugging in single step and looking directly into source code issues. This section explains how to use the gdb tool on Android platforms.
Target Device Side:
gdbserver :<port> --attach <PID>
Host PC Side:
- adb forward tcp:<port> tcp:<port>
- cd <your code base root directory>, so gdb can find the source code in the current work path.
- Run the command: gdb <program to debug> (the program should first be compiled with the -g switch).
- Start debugging the program.
- Set up the library path with these two gdb commands:
- #set solib-absolute-prefix <path of symbols> (be careful not to have special characters in the path, e.g., ~)
- #set solib-search-path <path of lib under symbols>
To connect to the gdbserver on the target side, run the gdb command: target remote :<port>.
Note regarding the program/library with debug symbols: although the Android build system uses the -g switch by default to build native libraries with debug symbols, it strips the debug symbols at the last build stage. So to use a native library with debug symbols, you need to use the one in the out/target/product/symbols directory.
gdb multi-thread debug command
The gdb tool also provides commands for debugging multiple threads in one process. Use the following commands:
info threads – print all thread information for the program you are debugging
thread <tid> – switch to debugging this thread with the specified ID.
break <file name>:<line> – set break point in source code file at the specified line. This command is very useful for system_servers that have many threads.
For example, the following commands set a breakpoint in the InputDispatcher thread of the system_server process:
break InputDispatcher.cpp:1280
continue
Then, to debug step by step, touch the screen at the point where you want; gdb will stop the InputDispatcher thread.
set scheduler-locking off|on|step – When you debug multiple threads, you will find many other threads running at the same time. To make your current debug thread the only running thread while using the “step”/“continue” commands, use “set scheduler-locking”.
off – do not lock any thread; all threads run (the default).
on – only the current debug thread runs.
step – when stepping (except with the “next” command), only the current debug thread runs.
Debug Case: Debug service in system_server process
This debug case shows how to debug a service thread in the system_server process with the gdb tool.
On the target device, type:
adb shell ps | grep system_server    # ex: the system_server PID is 312
gdbserver :1234 --attach 312
On the host PC, type:
adb forward tcp:1234 tcp:1234
cd ~/ics
gdb out/target/product/mfld_pr2/symbols/system/bin/app_process
#set solib-absolute-prefix /home/zwang/ics/out/target/product/mfld_pr2/symbols
#set solib-search-path /home/zwang/ics/out/target/product/mfld_pr2/symbols/system/lib
#target remote :1234
Gdb will load symbols from libraries
# break InputDispatcher.cpp:1280
# continue
Touch the screen at the point where you want gdb to stop the InputDispatcher thread so you can debug by steps.
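The PID lookup in the steps above can be wrapped in a small helper. This is a hypothetical convenience sketch (the function name and column position are assumptions; on most Android ps output the PID is the second column):

```shell
# pid_of: extract the PID of the first process whose ps line matches a name.
# Assumes the PID is the second whitespace-separated column.
pid_of() {
  grep "$1" | awk '{print $2}' | head -n 1
}

# Typical use on the host (requires a connected device):
#   PID=$(adb shell ps | pid_of system_server)
#   adb shell gdbserver :1234 --attach "$PID"
```

This avoids copying the PID by hand each time you re-attach gdbserver.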
Debug Case: Debug an Android app with a native library
This case shows how to debug an Android application with a native library using gdb.
On the target device, type:
adb shell ps | grep zwang.test.app (example: the app PID is 123)
gdbserver :1234 --attach 123
On the host PC, type:
adb forward tcp:1234 tcp:1234
cd ~/ics
gdb out/target/product/mfld_pr2/symbols/system/bin/app_process
#set solib-absolute-prefix /home/zwang/ics/out/target/product/mfld_pr2/symbols
#set solib-search-path /home/zwang/ics/out/target/product/mfld_pr2/symbols/system/lib:/home/zwang/app/obj/local/x86
When you build a native library with ndk-build, the library with debug symbols is located in the obj directory (the library under the lib directory is stripped of debug symbols). In our case it is /home/zwang/app/obj/local/x86, so we need to add this path to gdb's solib-search-path.
#target remote :1234 connects to gdbserver.
#break zwangjni_test_app.c:12 sets a breakpoint. You will get a message like "No source file named zwangjni_test_app.c. Make breakpoint pending on future shared library load?"; answer y.
#continue
After stopping at the breakpoint in the native library, you can debug step by step.
Android Core dump file Analysis
When a program crashes with an exception, a core dump file is created at /mnt/sdcard/data/logs/crashlogxx/xxxx_xxx_xxx.core. Use adb pull to copy the core dump file to the host PC.
To load the core dump file, run command: gdb <ics>/out/target/product/mfld_pr2/symbols/system/bin/app_process xxxx_xxx_xxx.core
Set symbols path
#set solib-absolute-prefix /home/zwang/ics/out/target/product/mfld_pr2/symbols
#set solib-search-path /home/zwang/ics/out/target/product/mfld_pr2/symbols/system/lib
Then you can use commands like bt, frame, up, down, and print to check the call stack when the program has exceptions.
Troubleshooting in Eclipse*
Eclipse is a useful integrated development environment for Android application development. Sometimes you will encounter strange errors when using it; this section shows some typical problems and how to resolve them.
If you see a "bad version number in .class file" error when you open Eclipse->Preferences->Android, the Eclipse environment variable points to the wrong Java runtime version number.
Go to Help->About->Installation Details to check the Eclipse environment variable and set it correctly.
kprobe
kprobe is a Linux* kernel debug tool that lets developers print kernel-level debug logs.
How to use kprobe for kernel debugging
Follow the steps below to print kernel-level logs to the dmesg buffer.
Copy the kprobes sample code into the Intel driver directory to build it as a kernel module:
cd ~/aosp/hardware/intel/linux-2.6/drivers/misc
cp -r /AOSP/hardware/intel/linux-2.6/samples/kprobes .
Change the makefiles to build the kprobe sample kernel module by adding the lines marked with "+" in the diff below.
wang@~/r4_1_stable/hardware/intel/linux-2.6/drivers/misc >git diff
diff --git a/drivers/misc/Makefile b/drivers/misc/Makefile
index 166a42e..6ef0f1d 100755
--- a/drivers/misc/Makefile
+++ b/drivers/misc/Makefile
@@ -3,6 +3,7 @@
 #
 intel_fabric_logging-objs := intel_fw_logging.o intel_fabricerr_status.o
+obj-m += kprobes/
 obj-$(CONFIG_IBM_ASM) += ibmasm/
 obj-$(CONFIG_AD525X_DPOT) += ad525x_dpot.o
 obj-$(CONFIG_AD525X_DPOT_I2C) += ad525x_dpot-i2c.o
diff --git a/samples/kprobes/Makefile b/samples/kprobes/Makefile
index 68739bc..8f253fc 100644
--- a/samples/kprobes/Makefile
+++ b/samples/kprobes/Makefile
@@ -1,5 +1,8 @@
 # builds the kprobes example kernel modules;
 # then to use one (as root): insmod <module_name.ko>
+CONFIG_SAMPLE_KPROBES=m
+CONFIG_SAMPLE_KRETPROBES=m
+
 obj-$(CONFIG_SAMPLE_KPROBES) += kprobe_example.o jprobe_example.o
 obj-$(CONFIG_SAMPLE_KRETPROBES) += kretprobe_example.o
Run make bootimage to build the kprobe sample kernel modules; you can then find them in:
out/target/product/mfld_pr2/kernel_build/drivers/misc/kprobes/kretprobe_example.ko
out/target/product/mfld_pr2/kernel_build/drivers/misc/kprobes/kprobe_example.ko
out/target/product/mfld_pr2/kernel_build/drivers/misc/kprobes/jprobe_example.ko
Re-flash the phone images, including boot.bin and system.img, so that the magic number is consistent between boot.bin and the kprobe modules; otherwise inserting the kprobe modules into the kernel will fail.
Type insmod kprobe_example.ko, then look for the kprobe kernel messages in /proc/kmsg.
Performance Tools
Performance issues have always been a headache for developers. Fortunately, there are some tools to help us. Here we introduce Intel® Graphics Performance Analyzers (Intel® GPA), Systrace, Matrix, Wuwatch, SEP, and Kratos.
Intel GPA
The Intel GPA tool can dump a lot of useful information from the device, such as CPU frequency, FPS, memory usage, network usage, OpenGL textures, etc.
To install Intel GPA, follow these steps:
- Download the GPA tool from: http://mcgwiki.intel.com/wiki/?title=GPA_usage
- Unzip gpa_13.1_release_194259_ubuntu.zip
- Type cd gpa_13.1_release_194259_ubuntu
- Do one of the following, depending on which instruction set you are targeting:
=>x86 system:
$ chmod 755 intel-gpa_13.1_x86.deb.shar
$ ./intel-gpa_13.1_x86.deb.shar
=>x64 system:
$ chmod 755 intel-gpa_13.1_x64.deb.shar
$ ./intel-gpa_13.1_x64.deb.shar
- Double tap intel-gpa_13.1_m64.deb, and complete the installation.
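The x86/x64 choice above can be automated with a small helper. This is a hypothetical sketch (the function name is an assumption; file names are those from the steps above, so adjust them to your download):

```shell
# gpa_installer: pick the GPA installer matching a machine architecture
# string as reported by "uname -m".
gpa_installer() {
  case "$1" in
    x86_64) echo "intel-gpa_13.1_x64.deb.shar" ;;
    *)      echo "intel-gpa_13.1_x86.deb.shar" ;;
  esac
}

# Pick the installer for the current host:
gpa_installer "$(uname -m)"
```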
To use Intel GPA, type:
$ gpa-system-analyzer
Figure 2-1
Connect the target device using a USB connection, and Intel GPA will recognize the device. Click the “Connect” button to connect the device, and the Intel GPA screen like the one shown in Figure 2-2 will display.
Figure 2-2
Figure 2-3
To launch an app on device, click on the name of the app in Intel GPA. The monitored actions include: CPU, Device IO, GPU, Memory, Memory Bandwidth, OpenGL*.
Figure 2-4
To analyze the results:
Figure 2-4 shows the actions being monitored in Intel GPA, including CPU 01 Frequency, CPU 02 Frequency, Disk Write, Target App CPU Load, and CPU 01 Load. The frequency of CPU core 1 is 2.0 GHz, and the load of CPU core 1 is 100%. With this tool, you can also find out whether there are anomalies in the CPU, GPU, etc.
Systrace
The Systrace tool helps analyze the performance of your application by capturing and displaying execution times of your application's processes.
Google's Systrace tool is supported on Android OS versions Jelly Bean and above. Use the following links to download the Systrace tool, which is included in the SDK package.
- http://developer.android.com/tools/help/systrace.html
- http://developer.android.com/tools/debugging/systrace.html
Follow these steps to use the Systrace tool.
Set up Systrace for the particular Android device you are targeting.
- $ python systrace.py --set-tags sync,gfx,camera,video,input,webview,audio,am,wm,view
- $ adb shell stop; adb shell start
NOTE: You can also set the trace tags for systrace using your device's user interface by navigating to Settings->Developer options->Enable traces. Select the options you want from the list and click OK.
Profile an Android application
To get a 10-second systrace log, run:
$ python systrace.py -d -f -i -l -t 10 -o mysystracefile.html
-o <FILE>, specifies the file to write the HTML trace report to.
-t N, traces activity for N seconds. Default value is 5 seconds.
-l, traces the CPU load. This value is a percentage determined by the interactive CPU frequency governor.
-i, traces the CPU idle events.
-f, traces the CPU frequency changes. Only changes to the CPU frequency are logged, so the initial frequency of the CPU when tracing starts is not shown.
-d, traces the disk input and output activity. This option requires root access on the device.
Note: After executing the above command, you have 10 seconds to exercise the Android application you want to profile.
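To avoid retyping the flag list above, the invocation can be composed by a small wrapper. A hypothetical sketch (the function name is an assumption; it only prints the command line so you can inspect it before running it against your SDK's systrace.py):

```shell
# systrace_cmd: compose the systrace invocation described above
# (-d -f -i -l plus a duration and an output file).
systrace_cmd() {
  local secs="$1" out="$2"
  echo "python systrace.py -d -f -i -l -t ${secs} -o ${out}"
}

systrace_cmd 10 mysystracefile.html
# prints: python systrace.py -d -f -i -l -t 10 -o mysystracefile.html
```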
To check the profile results:
Open mysystracefile.html (see Figure 3-1). Use the following keys to navigate the trace diagram:
“w” key : Zoom into the trace timeline
“s” key: Zoom out of the trace timeline
“a” key: Pan left on the trace timeline
“d” key: Pan right on the trace timeline
Figure 3-1
To analyze the results:
In the time range 4520 ms~4820 ms:
The CPU frequency of thread 6803 (UnityMain) is about 800 MHz.
The event marked in black in Figure 3-2 takes about 18 ms. By comparing this value with other devices, you can find out whether there is a difference in handling the same event.
The thread ran on different CPU cores, switching at least twice: CPU core 1 -> CPU core 2 -> CPU core 1... If a thread switches CPU cores frequently, it affects the performance of the device.
Figure 3-2
For more information, you can refer to this link: http://developer.android.com/tools/debugging/systrace.html
Matrix
Matrix is a tool to measure power and performance (PnP) on Intel processor-based mobile platforms. The data capture methodology and information on the internal counters is Intel property and shouldn’t be distributed externally.
Tool download link: http://mcgwiki.intel.com/wiki/?title=PnP_Matrix_Tool_Setup
Unzip the Matrix 3.4.3.zip, which contains three files:
Driver: For Android versions 4.0+, this is not useful so you can ignore it.
matrix: This is the tool we will push to the target device we are testing.
MatrixUserGuide-3.4.4: the User Guide for matrix.
To push Matrix to the target device, type these commands:
$ adb root
$ adb remount
$ adb shell
# cd data
# mkdir Matrix
$ adb push <dir>/matrix /data/Matrix
Run matrix to get data from the platform.
Matrix requires a duration and at least one feature to be specified.
Usage: ./matrix -f <feature> -t <time-in-seconds>
Here ./matrix is the matrix tool, -f specifies a feature, and -t the duration in seconds. The minimum is 1 second and the maximum is 3600 seconds; for example, -t 20 means 20 seconds. By default the output is written to a file named MatrixOutput.csv.
./matrix -f cstate -t 120 -o filename
This command stores the output in filename.csv (a user-specified name).
Note: When using -o, give only the file name without any extension. Matrix automatically appends .csv after post-processing.
Case study
This case shows how to use Matrix to capture feature values from a target device.
To list all the features Matrix supports, use the command "./matrix -h".
To capture multiple features at the same time, use a command like this:
./matrix -f <feature1> -f <feature2> -f <feature3> -t <time-in-seconds>
ex: "./matrix -f cstate -f pfmres -f power-max -f intr -t 300"
After this command completes, you will get a report document showing the capture results.
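The multi-feature command line above can be composed programmatically, which also makes it easy to enforce the 1-3600 second duration limit. A hypothetical sketch (the helper name is an assumption; it prints the command for inspection rather than running Matrix):

```shell
# matrix_cmd: build a ./matrix invocation from a duration (1-3600 seconds)
# and a list of feature names, as described above.
matrix_cmd() {
  local secs="$1"; shift
  # Matrix accepts durations between 1 and 3600 seconds.
  if [ "$secs" -lt 1 ] || [ "$secs" -gt 3600 ]; then
    echo "duration must be 1-3600 seconds" >&2
    return 1
  fi
  local cmd="./matrix"
  local f
  for f in "$@"; do
    cmd="$cmd -f $f"
  done
  echo "$cmd -t $secs"
}

matrix_cmd 300 cstate pfmres power-max intr
# prints: ./matrix -f cstate -f pfmres -f power-max -f intr -t 300
```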
Wuwatch
Wuwatch is a command line tool for both tracing and monitoring system power states. It traces C-state (processor power), S-states (S0ix and S3 system states), D0ix (device or IP block) states, both user and kernel wakelocks, and P-state (processor frequency) activity. While tracing C-states, it attempts to determine the cause of every C-state wakeup, which is a transition to a higher power state.
Tool download link:
http://wiki.ith.intel.com/display/wuwatch/WakeUp+Watch+Power+Analysis+Tools
There are 9 files in the downloaded package; pay particular attention to two of them:
Summary_data_v3_1.py: a Python* script that generates summary data from the wuwatch raw text trace output.
WakeUpWatchForAndroid: the Wuwatch user guide.
Integration with Android Distributions.
The driver and binary file of wuwatch are now integrated into many Android distributions. Before using the tool to get raw data, you must do some initialization.
$ adb root
$ adb remount
$ adb shell
# cd /lib/modules/
# insmod apwr3_1.ko
# lsmod (Check the result. See Figure 5-1 for an example.)
Figure 5-1
$ mkdir /data/wuwatch
$ cp /system/bin/wuwatch /data/wuwatch
$ cp /system/bin/wuwatch_config.txt /data/wuwatch (Check the result. See Figure 5-2 for an example.)
Figure 5-2
Get the raw data from DUT.
Use the following steps to quickly collect C-state, P-state, and wakelock data for 60s on an Android- based system.
$adb root
$adb remount
$adb shell
#cd /data/wuwatch/
#./wuwatch -cs -ps -wl -t 60 -o ./results/test
#cd results (After 60s, check the results. An example is shown in Figure 5-3.)
Figure 5-3
# exit
$adb pull /data/wuwatch/results/ <pc-local-dir>/ (Check the result. See Figure 5-4 for an example.)
Figure 5-4
Summarize the results.
Before summarizing the results, confirm that Python 2.7 is installed on your PC (Windows* or Linux*).
Copy summary_data_v3.1.py to the same directory as "test.txt" and "test.ww1".
$ c:\Python27\python <local-dir>\summary_data_v3.1.py -f <local-dir>\test.txt --txt -o <local-dir>\test-summary.txt (Check the results.)
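The summary step above can also be composed by a small helper, which is handy when processing several traces. A hypothetical sketch (the function name is an assumption; it prints the command line, with paths following the example above):

```shell
# summary_cmd: compose the summary_data_v3.1.py invocation for a wuwatch
# trace: input text trace, --txt output mode, and a summary file name.
summary_cmd() {
  local dir="$1" trace="$2" out="$3"
  echo "python ${dir}/summary_data_v3.1.py -f ${dir}/${trace} --txt -o ${dir}/${out}"
}

summary_cmd ./results test.txt test-summary.txt
# prints: python ./results/summary_data_v3.1.py -f ./results/test.txt --txt -o ./results/test-summary.txt
```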
SEP (Sampling Enabling Product)
Sampling Enabling Product (SEP) is a performance tool used for analyzing performance and tuning software on all Intel processor-based platforms. The SEP tool supports event-based sampling and counting through CPU performance counters. The tool consists of a collection portion (sep) and an output portion (sfdump5).
SEP collection overhead is extremely low (< 2% at default sampling frequencies).
Tool Download link: http://mcgwiki.intel.com/wiki/?title=SEP_for_gingerbread
How to use SEP
- Connect an Android device by USB
- On the host machine, unzip sep3_android_ia32.tar.bz2
- On host machine, cd to sep3_android_ia32/lin-install folder
- On the host machine, type:
$ adb root
$ adb remount
$ . sep-install.sh
Figure 6-1
- On the host machine, type:
$ adb root
$ . sep-start-driver.sh
Figure 6-2
This will enable the SEP kernel module on the Android device.
- On the Android device, run adb root and adb shell, then use SEP to do the profiling.
# cd /data/sep3.9/bin
# source setup_sep_runtime_env.sh
# ./sep -start -d 20 -out real_prof
Figure 6-3
NOTE: After the command executes, exercise the program you want to profile within 20 seconds. After 20 seconds of profiling, a real_prof.tb6 file is generated. To list all the features SEP supports, use the command "./sep -help".
The file real_prof.tb6 is the SEP profile result.
- Analyze the results:
(1). Use the SFDUMP5 tool to analyze SEP profile results.
# sfdump5 real_prof.tb6 -modules | less
To list all the features sfdump5 supports, use the "./sfdump5" command.
(2). Use the VTune™ Analyzer to analyze profile results.
You need to purchase this tool. To download and install, go to: http://vtune.intel.com
Kratos
Kratos is an Intel-developed tool that monitors Android application system resource utilization, broadcasts system messages (aka Android intents), and checks battery activity and platform thermals.
Kratos uses the collected data to measure power consumption of the entire device or estimate power consumption of different platform components, which are displayed with run-time and post-processed graphs and as averages or totals in a table. System broadcast messages are overlaid on the graphs to provide workload context, enabling you to draw conclusions about a specific workload's power consumption profile.
Kratos is integrated by default into the main (R4) userdebug and eng branches of the JB PSI Android build.
How to use Kratos
Launch Kratos from the Android Launcher application.
Click the "Start Manual Profiling" button (see Figure 7-1).
Figure 7-1
Select the options you need to monitor in the "DATA" table (see Figure 7-2).
Figure 7-2
Set the duration of data collection (see Figure 7-3).
Figure 7-3
Click the Start button to collect data from the target device (see Figure 7-4).
Figure 7-4
If you did not enter a value for “DURATION”, you must Stop Profiling manually by clicking the Stop Profiling button, shown in Figure 7-4.
When profiling stops, either by the duration setting or manually, you must confirm the action and save the results by clicking Yes (see Figure 7-5 and Figure 7-6).
Figure 7-5
Figure 7-6
Click the Load Session button to load the test data, as shown in Figure 7-7.
Figure 7-7
Select the data that you want to analyze (see Figure 7-8), then click the Load button.
Figure 7-8
Check the results in the graph, like the one shown in Figure 7-9
Figure 7-9
Check the results with Stats as shown in Figure 7-10
Figure 7-10
Intel® HAXM vs. Microsoft® Hyper-V: resolving the conflict
Background
Intel® Beacon Mountain is a powerful tool for all software developers who want to build apps for mobile devices based on the Android operating system.
In just a few simple steps, this tool installs the entire suite of tools needed to develop applications natively and to cross-compile them both for devices based on ARM* processors and for devices based on Intel® architectures (e.g., Intel® Atom).
The Intel® Beacon Mountain tool is available in the "Tools and Downloads" section of the Android area of the Intel® Software portal.
Intel® Beacon Mountain also provides an emulation environment for deploying applications and testing their performance and correct behavior on devices of various kinds, with different screen resolutions, different Android OS versions, different processors, and different amounts of installed RAM.
During the first installation phases, the Beacon Mountain installer checks whether it can also install a component called Intel® HAXM (Intel® Hardware Accelerated Execution Manager): this software is a hardware-assisted virtualization engine (also known as a hypervisor) that uses Intel® Virtualization Technology (Intel® VT) to accelerate execution of the Android emulator on a host machine at the hardware level.
However, different active hypervisor engines cannot coexist on the same machine. For this reason, some developers have had trouble installing the Intel® HAXM component of Intel® Beacon Mountain when another hypervisor was already installed and enabled on their PC, for example Microsoft* Hyper-V, a fundamental tool for developing Windows* Phone applications.
It is important to stress that the installation of Intel® Beacon Mountain can be completed without installing Intel® HAXM: the installer simply reports that this particular component cannot be installed, which does not prevent the other tools in the suite from installing successfully. It may, however, result in poor responsiveness of the Android emulator that Intel® Beacon Mountain installs.
So how can the conflict between these hypervisors, in particular between Intel® HAXM and Microsoft* Hyper-V, be resolved? Let's see!
RESOLVING THE CONFLICT
A quick and easy way to resolve the conflict consists of an operation that is simple but at the same time very delicate. We will use BCDEDIT to create two different Windows* 8 boot options: the first will be our traditional option, which enables Hyper-V by default (if present/installed); the second will let us start Windows* 8 in a mode where the Hyper-V daemon is disabled, allowing us to install and use the entire suite of tools in Intel® Beacon Mountain (including the hardware acceleration provided by Intel® HAXM) without conflicts.
WARNING: a mistake in the procedure described below could prevent your Windows operating system from booting correctly. Neither Intel® nor the author of this guide can be held responsible for any resulting system malfunctions. BE CAREFUL!
Step 1.
Open a command prompt with administrator privileges:
Step 2.
In the prompt you just opened, type:
C:\>bcdedit /copy {current} /d "Windows 8.1 Without Hyper-V"
You will get back a message like this (for example):
The entry was successfully copied to {08e28906-0ab9-11e3-9b2f-402cf41953d5}.
where {08e28906-0ab9-11e3-9b2f-402cf41953d5} is a GUID unique to your system.
Step 3.
Now we can modify the newly created secondary boot entry to disable Microsoft Hyper-V in it, by simply typing:
bcdedit /set {your-GUID} hypervisorlaunchtype off
Continuing the previous example, with the resulting GUID the command becomes:
bcdedit /set {08e28906-0ab9-11e3-9b2f-402cf41953d5} hypervisorlaunchtype off
Step 4.
Now we can close the prompt and reboot the system.
After the first boot phase, i.e., the BIOS load, the Windows 8 boot manager will offer two boot options: the first is our old Windows 8 system with Hyper-V enabled; the second entry is the same old Windows 8 system, but the Hyper-V service will be prevented from starting.
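Steps 2 and 3 above can be sketched as one helper function. This is a hypothetical sketch: the function name is an assumption, bcdedit exists only on Windows and must run from an elevated prompt, and on other systems the function simply reports that and returns:

```shell
# disable_hyperv_entry: copy the current Windows boot entry and turn the
# hypervisor launch off in the copy (Steps 2-3 of the procedure above).
disable_hyperv_entry() {
  if ! command -v bcdedit >/dev/null 2>&1; then
    echo "bcdedit not found: run this from an elevated Windows command prompt"
    return 1
  fi
  local guid
  # The /copy output message contains the new entry's GUID in braces.
  guid=$(bcdedit /copy '{current}' /d "Windows 8.1 Without Hyper-V" | grep -o '{[^}]*}')
  bcdedit /set "$guid" hypervisorlaunchtype off
}

# On Windows (elevated):  disable_hyperv_entry
```

Note that parsing the GUID out of the localized /copy message is fragile; double-check the created entry with "bcdedit /v" before rebooting.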
Many thanks to Microsoft* MVP (Virtual Machine Expert) Francesco Valerio Buccoli for his valuable technical advice.
Porting the Havok Vision Engine to Android* Platforms
by Carla Brossa
Downloads
Porting the Havok Vision Engine to Android* Platforms [PDF 761KB]
The revolution of mobile platforms
The earliest attempt I know of to port a 3D engine to a real phone was Superscape's, back in the very early 2000s. They were working with a number of OEMs to try to make their Swerve engine run on an ARM7. Those phones’ CPUs ran at about 40 MHz and included no cache. The content they could run on those devices was a maximum of 40 polygons, flat-shaded, with no texture and no z-buffer. It was a challenge for any artist! By comparison, early smartphones like the Nokia 7650 were super-fast, with an ARM9 running at 100 MHz, and cache. But that was more than ten years ago.
The evolution of mobile platforms since then has been spectacular. The first 3D games on phones had very little in common with what we now see on Android devices. One of the triggers of this giant leap was certainly the integration of dedicated graphics hardware into mobile SoCs (System-on-Chip). Along with many other architecture improvements, it powered a huge boost in the triangle throughput capability, from a few hundreds to hundreds of thousands, and an increase of two orders of magnitude in the pixel count. This has more recently allowed developers to finally create console quality games for mobile devices.
Yet, game creators are hungry consumers of resources and have the bad habit of pushing the technology to its limits. That is why many challenges nowadays are very similar to those of the past. In many ways, mobile platforms are almost on par with the current generation of consoles, but they are still way behind modern gaming PCs, and they also have some particularities that one should know about before diving into developing mobile games.
Energy efficiency is still the main constraint that limits the overall processing power of mobile devices, and will continue to be so in the foreseeable future. Memory is also limited—although this has improved enormously in the past few years—and shared with other processes running in the background. Bandwidth is, as always, a very precious resource in a unified architecture and must be used wisely or it could lead to a dramatic drop in performance. In addition, the variety of devices, processing power, display sizes, input methods, and flavors in general is something that mobile developers have to deal with on a daily basis.
Here comes Anarchy!
At Havok we have been trying to make life a bit easier for Android developers by handling most of these challenges ourselves with Project Anarchy.
We have recently announced the release of this toolset made up of Havok’s Vision Engine, Physics, AI, and Animation Studio; components of which have been used to build multiple games like Modern Combat 4, Halo* 4, Skyrim*, Orcs Must Die, and Guild Wars 2 to name a few. Project Anarchy optimizes these technologies for mobile platforms, bundles them together along with exporters for Autodesk’s 3ds Max* and Maya* and a full WYSIWYG editor, and allows users to download a complete toolkit for development on iOS*, Android (ARM and x86), and Tizen*.
Figure 1. "A screenshot of the RPG demo included in Project Anarchy, is an example of content that runs on current Android platforms."
Vision goes mobile
As one would expect, the tool that required the most work to be ported to Android was our 3D game engine. The Vision Engine is a scalable and efficient multi-platform runtime technology, suited for all types of games, and capable of rendering complex scenes at smooth frame rates on PCs and consoles. Now the Vision Engine had to perform at similar standards on mobile platforms. And as important as that, we wanted to provide the same toolset as for any other platform, but streamlined specifically to address the challenges associated with development on mobile platforms.
Having worked with consoles such as Xbox 360*, PlayStation* 3, and PlayStation Vita*, we were already familiar with low memory environments, and we had optimized our engine and libraries for those kinds of constrained environments. But moving to mobile meant having to make other optimizations, and the specifics of mobile platforms required us to think of some new tricks to make things run nicely with limited resources. Several optimizations had to be made to reduce the number of drawcalls, the bandwidth usage, the shader complexity, etc.
A few rendering tricks
For example, additional rendering passes and translucency are expensive. That is why we had to simplify our dynamic lighting techniques. The optimization we used here was to collapse one dynamic light—the one that affects the scene the most and would thus have produced the highest overdraw—into one single pass with the static lights. As there is often one dominant dynamic light source in a scene, this greatly helped performance by reducing drawcall count and bandwidth requirements. In addition, we also offer vertex lighting as a cheap alternative, but pixel lighting will still be required for normal maps.
Vision also supports pre-baked local and global illumination, which is stored in lightmaps (for static geometry) and what we call a lightgrid (used for applying pre-computed lighting contributions to dynamic objects). In a lightgrid, you have a 3D grid laid out in the scene that stores the incoming light from six directions in each cell. On mobile devices, we can optionally use a simpler representation to improve performance. This representation only stores light from one primary direction along with an ambient value. The lighting results do not achieve the same visual fidelity, but they are usually good enough and very fast.
Figure 2. "The difference in the lighting results when using a normal lightgrid versus a simple lightgrid."
As mobile GPUs often have limited resources for complex arithmetic operations, evaluating exponential functions for specular lighting could also become a serious bottleneck in terms of frame rate. To avoid this, we pre-bake cubemaps in our scene editor that accumulate lighting information from all surrounding light sources. While diffuse lighting is computed as usual, we approximate specular highlights by sampling from the generated cubemap and adjusting the intensity to account for local occlusion. This allows us to approximate an arbitrary number of specular highlights at the cost of a single texture lookup, while still getting a very convincing effect.
Shadow mapping was another feature that needed some tweaking. Instead of using a deferred shadow mask as we do on PCs (i.e., performing the depth comparison in a full-screen post-processing pass and then using the resulting texture to modulate the dynamic lighting), we fetch the shadow map directly during the lighting pass to save memory bandwidth. Furthermore, as texture sampling is relatively expensive on mobile devices, we limited our shadow maps to a single sample comparison instead of percentage-closer filtering. As a result, the shadows have hard edges, which is generally acceptable if shadow casting is restricted to a relatively small area. We currently support shadow maps for directional and spot lights, but we chose not to support shadow maps for point lights on mobile platforms for now, as the tetrahedron shadow mapping technique we use on PCs and consoles would be prohibitively expensive. Shadow mapping on mobile is also recommended to be used only in small areas, and to have few objects casting shadows, like the players and maybe a few enemies for example.
We also spent some time in making volumetric effects (volumetric lights, fog volumes, sun shafts) run smoothly on mobile. These techniques typically require rendering multiple transparent passes, performing multiple texture sampling operations per pixel, or computing integrals—each of which is prohibitively expensive on mobiles. As a result, we ended up going down a different route. On mobile platforms, our volumes are actually made of a low-poly mesh consisting of a few layers, like an onion, which a shader will fade out as the camera approaches. The trick here consists of collapsing the geometry to lines as soon as the transparency is so low that you can’t actually see the geometry anymore. These degenerated triangles will not be rasterized and so the pixel fill-rate is significantly decreased and reasonable performance is achieved.
Figure 3. "An example of shadow maps and volumetric effects running on Android*"
Terrains also required some modifications for mobile. On PCs and consoles we use height-field based terrains with dynamic geometry mipmapping, along with runtime texture blending, and three-way mapping to avoid texture stretching on steep slopes. As a result, the vertex counts are relatively high, and the bandwidth requirements resulting from mixing multiple detail textures are substantial. To make Vision terrains work on mobile platforms, we allow generating optimized static meshes from heightmaps and baking down the textures into a single map per terrain sector. As a consequence, we can’t render truly huge worlds with runtime-modifiable terrain, but this limitation is typically acceptable on mobile.
Another convenient feature that we added to Vision to improve performance of pixel-heavy scenes on devices with very high resolution displays is an option for upscaling. This is done by rendering the scene into a low resolution off-screen target and upscaling it to the display resolution in a separate step. On the other hand, to maintain high visual quality, UI elements and text are still rendered at the display full resolution. This works quite well on devices with resolutions higher than 300 dpi, and can yield substantial performance gains.
Shader authoring considering mobile GPU oddities
All our existing shaders in the Vision Engine are written in HLSL. So, the first obvious problem when targeting OpenGL* ES platforms is that shaders require GLSL. To make cross-platform development as easy as possible, we designed a system in which shaders only need to be written once, in HLSL/Cg, and they are automatically translated to GLSL by vForge, our scene editor, when they are compiled.
The second concern when writing shaders for mobile is how different the hardware architecture is from other more traditional platforms. For a start, to save space and power, all mobile SoCs have unified memory. System RAM is shared between the CPU and GPU; it is limited, and typically slower. Therefore, our aim is to avoid touching RAM as much as possible. For example, minimizing the vertex size and the number of texture fetches is generally a good idea.
Another big difference is that most mobile GPUs, such as the PowerVR* GPUs used in Intel® Atom™ systems, use tile-based deferred rendering. The GPU divides the framebuffer into tiles (16x16, 32x32), defers the rendering until the end, and then processes all drawcalls for each tile; one tile fits entirely inside one GPU core. This technique is very efficient because pixel values are computed using on-chip memory, requiring less memory bandwidth and less power than traditional rendering techniques, which is ideal for mobile devices. An additional benefit of this approach is that, as it just involves comparing some GPU registers, depth and stencil testing is very cheap. Also, as only the resolved data is copied to RAM, there is no bandwidth cost for alpha blending, and MSAA is cheap and uses less memory.
In tile-based architecture, color/depth/stencil buffers are copied from RAM to tile memory at the beginning of the scene (restore) and copied back to RAM at the end of the scene (resolve). These buffers are kept in memory so that their contents can be used again in the future. In many applications, these buffers are cleared at the start of the rendering process. If so, the effort to load or store them is wasted. That is why in Vision we use the EXT_discard_framebuffers extension to discard buffer contents that will not be used in subsequent operations. For the same reason, it is also a good idea to minimize switching between render targets.
We also want to avoid dependent texture reads in the pixel shader, as they make texture prefetching useless. When dependent texture reads are performed by the shader execution units, the thread will be suspended and a new texture fetch task will be issued. To prevent this, we do not do any mathematical operations on texture coordinates in the pixel shader.
Dynamic branching in our shaders is also something that we want to avoid, as it causes a pipeline flush that ruins performance. Our solution for this is a shader provider that will select the particular shader permutation for a specific material depending on its properties and thus avoid branching. Also, to reduce the runtime memory consumption we store these shaders in a compressed format and only decompress them when they are actually needed.
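A shader provider of this kind can be modeled as a lookup from material feature flags to a pre-built permutation. This is a hypothetical sketch; the permutation names and feature flags are invented for illustration and are not Vision's actual identifiers:

```python
# Instead of dynamic branching inside one uber-shader, a pre-compiled
# permutation is selected from the material's feature flags, so the GPU
# never branches at runtime.

SHADER_PERMUTATIONS = {
    frozenset(): "base",
    frozenset({"DIFFUSE_MAP"}): "base_diffuse",
    frozenset({"DIFFUSE_MAP", "NORMAL_MAP"}): "base_diffuse_normal",
    frozenset({"DIFFUSE_MAP", "NORMAL_MAP", "SPECULAR"}): "base_full",
}

def select_shader(material_features):
    """Pick the permutation matching the material's enabled features."""
    key = frozenset(material_features)
    if key not in SHADER_PERMUTATIONS:
        raise KeyError(f"no permutation compiled for {sorted(key)}")
    return SHADER_PERMUTATIONS[key]
```

The trade-off is the classic one: more compiled shader variants (hence the compressed on-disk storage mentioned above) in exchange for branch-free pixel shaders.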
It is also important to take into account the precision used in mathematical operations in shaders, as reducing the precision can substantially improve performance. Therefore, it is recommended to always use the minimum acceptable precision to achieve any particular effect.
Figure 4. "An example of usage of lightweight mobile shaders in Vision: a glowing emissive texture and a specular cubemap that gives a shiny effect to the rocks."
These are just general optimizations that should work on all Android platforms, but keep in mind that every device and every GPU has its oddities. So, a good piece of advice would be to always read the vendor-specific developer guidelines before targeting any platform.
A Lifetime headache
With incoming calls and messages and a thousand different events popping up at the most inappropriate time, application lifetime management on Android devices becomes a serious matter. The operating system can require applications to free up resources, for instance, when another application is launched and requires system resources. Similarly, the operating system can require your application to terminate at any time.
In Vision we handle unloading and restoring graphics resources (textures, GPU buffers, shaders) when the mobile app goes to the background. This is mandatory for Android because all OpenGL ES handles are invalidated as soon as the app goes to the background, but on other platforms it is also generally a good idea to free some memory to reduce the risk of the app being terminated by the operating system due to a low memory situation.
Also on Android, handling the OS events can be a tricky job, because the order in which they happen is not the same for different devices and/or manufacturers. So this requires implementing a robust internal state handler that depends on the exact order of events as little as possible. This means monitoring the running state of an app, checking if it has a window handle, and whether it is focused.
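Such an order-independent state handler can be sketched as a set of flags with derived queries. The event names below are illustrative, not the actual Android or Vision callback names:

```python
# Hedged sketch: derive app state from independent flags rather than from the
# exact event sequence, since the order of lifecycle events varies across
# devices and manufacturers.

class AppState:
    def __init__(self):
        self.running = False
        self.has_window = False
        self.focused = False

    def on_event(self, event):
        if event == "resume":
            self.running = True
        elif event == "pause":
            self.running = False
        elif event == "window_created":
            self.has_window = True
        elif event == "window_destroyed":
            self.has_window = False
        elif event == "focus_gained":
            self.focused = True
        elif event == "focus_lost":
            self.focused = False

    def can_render(self):
        # Derived from the flags, so event ordering does not matter.
        return self.running and self.has_window and self.focused
```

Because the decision is derived from current flags instead of a transition table, two devices that deliver the same events in different orders still end up in the same effective state.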
Figure 5. "Application lifetime management on Android devices becomes a serious matter."
Havok Physics, AI, and Animation Studio
The other products included in Project Anarchy—Havok Physics, AI, and Animation Studio—do not have any graphical parts in them. So, when we ported them to mobile, it was purely about CPU and memory optimization.
We already supported Linux*-based systems, and since mobile platforms have broadly similar compilers and system APIs, getting the code to work was relatively straightforward. The main effort after that was making it fast. We worked closely with Intel to make sure our code was optimized for Intel® Streaming SIMD Extensions (Intel® SSE). The compiler can make a large difference in some areas of code, and we see ongoing performance gains from newer compiler revisions as the platform SDKs mature.
The second prong of attack was multithreading. Given that most mobile CPUs are now multicore, we took our code, already well optimized for multithreaded environments on PCs and consoles, and thoroughly profiled it on mobile platforms to ensure that it was efficiently multithreaded on our target systems.
Finally, we had to make sure our code stayed cache efficient, given that memory speeds on mobile are relatively low. This is not a problem specific to mobile, so our existing optimizations to reduce cache misses ported over well.
From painful to painless workflow
The development workflow on mobile platforms has always been somewhat painful, especially when developing for multiple platforms and having to port assets to different formats to match the requirements of each device (i.e., different texture sizes, file formats, compression methods). On top of this, files usually have to be bundled with the application package, which means that for each asset change (textures, sounds, models) the package has to be rebuilt and uploaded to the device. For larger projects, the package build, upload, and install times can become prohibitively long and slow down development through lengthy iteration cycles.
Figure 6. "Screenshot of the RPG demo content in the scene editor vForge during development"
Managing and previewing assets
To make this process easier and faster, we decided to implement a few custom tools. The first one is an asset management system that has an easy to use asset browser integrated with our scene editor vForge. The asset management system provides automatic asset transformation capabilities and can convert textures from their source format (i.e., PNG, TGA) to a platform-specific format (i.e., DDS, PVR, ETC). As a result, developers do not have to think about which texture formats are supported on which platform. The actual conversion is automatically performed in vForge, but developers can also configure each asset individually to allow precise tweaking if needed, or even hook in their own external tool to do custom transformations on any type of asset (i.e., reducing the number of vertices of models).
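The platform-to-format mapping at the heart of such a conversion step might look like this sketch (the platform keys and the mapping table are assumptions based on the formats named above, not the asset system's actual configuration):

```python
# Pick a platform-specific texture format so developers never have to think
# about which formats each target supports.

TEXTURE_FORMAT_BY_PLATFORM = {
    "pc": "DDS",
    "ios": "PVR",       # PowerVR-class GPUs
    "android": "ETC",   # baseline compressed format in OpenGL ES 2.0
}

def target_texture_format(platform, source_name):
    """Map a source asset (e.g., PNG/TGA) to its converted file name."""
    ext = TEXTURE_FORMAT_BY_PLATFORM[platform].lower()
    stem = source_name.rsplit(".", 1)[0]
    return f"{stem}.{ext}"
```

Per-asset overrides and external conversion hooks, as described above, would then simply replace this default mapping for individual assets.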
We also added a material template editor in vForge that allows specifying platform-dependent shader assignments. This makes it possible to have different shaders optimized for different platforms, configure them once, and reuse that configuration on every material that needs it.
All scenes can be previewed in vForge using device-specific resources and shaders instead of the source assets, thus allowing the artists to quickly see how the scene will look on the target device without having to deploy it.
Figure 7. "The asset management system includes an easy to use asset browser integrated with the scene editor, with automatic asset transformation capabilities."
The magically mutating assets
The second tool we implemented to enable faster turnaround times is an HTTP-based file serving system that allows an application running on a mobile device to stream in data from a host PC. This is extremely useful during development cycles because—together with the vForge preview—it completely removes the need for re-packaging and re-deploying the application every time an asset is modified.
Behind the scenes, the file server will cache downloaded files on the device and only re-download them when they have changed on the host PC, allowing for very fast iteration times, as only changed scenes, textures, etc. are transferred. In most cases it isn't even necessary to restart the application on the device to update resources, as almost all resource types can be dynamically reloaded inside a running application.
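The cache decision itself reduces to a comparison of per-file fingerprints. A minimal sketch, assuming the host exposes a content hash per file (the article does not specify the actual protocol):

```python
# Re-download a file only when it is new or its content changed on the host;
# everything else is served from the on-device cache.

def needs_download(local_cache, remote_index, path):
    """True when the device must fetch the file again from the host PC."""
    local_hash = local_cache.get(path)       # None if never downloaded
    remote_hash = remote_index.get(path)
    if remote_hash is None:
        raise FileNotFoundError(path)
    return local_hash != remote_hash
```

Combined with dynamic resource reloading, this is what keeps iteration times down to the cost of transferring only the assets that actually changed.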
As a side effect, creating and deploying application packages is usually much faster when using this tool, as packages will only have to contain the compiled executable code—even scripts can be transferred over the file server connection. This allows for much faster iteration times, given that executables are typically very small in comparison with the associated scene data.
Handling the input remotely
Another tool we created to shorten turnaround times is what we’ve called “Remote Input.” It is actually a very simple idea, consisting of an HTML5-based web app that forwards inputs from a mobile device to the game running on a PC. Touch input events, as well as device acceleration and orientation data, are simply forwarded from the web browser on your mobile to the PC version of your application, or even to a scene running inside vForge. It can be used to rapidly prototype and test multi-touch input in your game without having to deploy it to a mobile device.
OpenGL ES 3.0 and the future
Some of the limitations in the techniques explained in this article may not be necessary in the near future. As smartphones and tablets get more and more powerful, the restrictions will be relaxed. But game features will advance and continue to push mobile hardware to its limits, as they have been doing for the past fifteen years.
New devices will offer more CPU and GPU cores, making it even more necessary to exploit the wonders of multithreaded computing. Longer term, we will probably get closer in performance and capabilities to current generation PCs, but there will still be some gotchas and caveats to watch out for on mobile, like the limited memory bandwidth.
The new APIs that are right there on your doorstep also offer a broad range of new, exciting, and challenging possibilities. We already have a few devices out in the wild with cores and drivers fully conformant with OpenGL ES 3.0 (supported from Android 4.3 Jelly Bean). Some of the new features include occlusion queries (already in use on PCs and consoles), transform feedback (enabling features like GPU skinning with very high bone counts), instancing (extremely useful to reduce drawcall count and therefore CPU load), multiple render targets (to facilitate deferred rendering and post-processing effects), a bunch of new texture formats, and many other cool features. On the other hand, we will also be able to start moving some of the computational work over to the GPU thanks to OpenCL*, which is just emerging on mobile. We already have full GPU-driven physics simulations on the PlayStation 4, but this is an open R&D area for us in the mobile arena and will certainly be very exciting to explore.
About the author
Carla is a Developer Relations Engineer at Havok, responsible for helping developers to make better games with the Vision Engine. She has been working in the mobile 3D graphics arena since 2004. She started at RTZ interactive, a small company in Barcelona, developing 3D games for Java and Brew phones. A few years later, she moved over to developing games for the iPhone. Prior to joining Havok, she spent a couple of years at ARM working on the OpenGL ES drivers for the Mali-T600 series of GPUs.
Intel, the Intel logo, and Atom are trademarks of Intel Corporation in the U.S. and/or other countries.
Copyright © 2013 Intel Corporation. All rights reserved.
*Other names and brands may be claimed as the property of others.
OpenCL and the OpenCL logo are trademarks of Apple Inc and are used by permission by Khronos.
Havok™ Project Anarchy™
Havok's Project Anarchy is a free mobile game engine for iOS, Android (x86 included), and Tizen. It comprises Havok's Vision Engine together with Havok Physics, Havok Animation Studio, and Havok AI. It features an extensible C++ architecture, mobile-optimized rendering, a flexible asset management system, and Lua scripting and debugging.
Several complete game samples are included in the SDK, and a number of tutorials are posted on the official Project Anarchy site to help developers get started with the engine.
The engine, which lets you publish your applications for free across operating systems, is used in games you certainly know: The Elder Scrolls®, Halo®, Assassin's Creed®, Uncharted™ and Skylanders™.
Key points:
- Extensible, plugin-based C++ architecture
- Complete game samples with source code
- Large community: forum discussions, updates, training, ...
- NO commercial restrictions; FREE to publish games for iOS, Android, and Tizen
- Includes FMOD, the audio library used in games and applications for sound management
For more details on the bundled tools, see this article translated into French on Développez:
http://jeux.developpez.com/tutoriels/Project-Anarchy/0-tour-d-horizon/
or its English original on GameFromScratch:
http://www.gamefromscratch.com/post/2013/07/03/A-closer-look-at-Project-Anarchy.aspx
Short demo video on YouTube: http://bit.ly/14tw837
Intel® USB Driver for Android Devices
The Intel USB Driver for Android devices lets you connect your Windows*-based machine to your Android device containing an Intel Atom processor.
Note: Version 1.1.5 of the driver is designed for Android application developers using Microsoft Windows* 8. For device support, contact your device manufacturer. The download link for version 1.1.5 is below:
Microsoft Windows*
Windows 8 (32/64-bit), Windows 7 (32/64-bit), Windows Vista (32/64-bit), Windows XP (32-bit only)
Click here for the Windows installation guide
Link | Description | File Size | MD5 Checksum | SHA-1 Checksum |
---|---|---|---|---|
IntelAndroidDrvSetup1.1.5.zip | System Driver | 8.8 MB | - | - |
Third-Party Android Application Debug Reference on Intel® Processor-based Platforms Part 2
Contents
Debug Third-Party Vendor Applications
Debug Java application
Debug x86 native library of application
Application Usage Pre-condition
Fail to install app
App has hard code dependence on ARM abi/arch property, etc.
App has dependence on OEM framework (like Samsung changes its framework)
App has some dependence on native library which is missing on Intel processor-based platform
App doesn’t have permission
Data base structure difference
App has dependence on ISV function package like <uses-library android:name="xxx_feature" /> in AndroidManifest.xml
com.google.android.vending.licensing.LicenseValidator.verify Issue related with paid app
How to add resources into the apk file
Google Play Store Filter Details
Debug Third-Party Vendor Applications
There are many third-party vendor applications available on the Android market, and making them run well is very important for the success of a mobile platform in the market. The problem is that no source code is available for these third-party applications, and sometimes they do not run well on some mobile platforms. How can we identify the issue?
Debug Java application
There are many tools available for debugging Android Java applications; they can help developers debug quickly and easily.
Below is a list of Android debug tools and links for how to download them:
baksmali: Disassembles odex/dex files into smali files. You need to put the files under /system/framework in the same working directory.
Command: java -jar baksmali.jar -x file.odex
Download link: http://code.google.com/p/smali/downloads/list
smali: Assembles smali files into classes.dex.
Command: java -Xmx512M -jar smali.jar out -o classes.dex
keytool: Creates apk certificates.
keytool -genkey -v -alias CERT -keyalg RSA -keysize 2048 -validity 10000 -keystore CERT.keystore
jarsigner: apk signing tool.
jarsigner -verbose -keystore CERT.keystore to_sign.apk CERT
dex2jar: Decompiles classes.dex into a jar file.
JD-GUI: Inspects Java source from a jar file.
apktool: Decompiles resource/xml/smali files from an apk. It can also rebuild an apk from smali.
Download link: http://code.google.com/p/android-apktool/downloads/list
AXMLPrinter: Converts an apk's binary xml files into readable form.
Download link: http://code.google.com/p/android4me/downloads/list
zipalign: Optimizes apk file alignment and size.
Command: zipalign -v 4 unaligned.apk aligned.apk
Smali debug
Google provides the apktool to decompile Dalvik dex into smali code (Dalvik byte code).
After using apktool, there will be a smali directory with all smali files for application dex.
For smali opcodes, please refer to this link:
http://pallergabor.uw.hu/androidblog/dalvik_opcodes.html
Smali file general format is as follows:
.class <permission> [decoration word] <class name>
.super <parent class>
.source <source code file name>
For example:
Open MainActivity.smali, the first 3 lines are as following:
.class public Lcom/droider/crackme0502/MainActivity;
.super Landroid/app/Activity;
.source "MainActivity.java"
1st line – this is class name MainActivity. Its package is com.droider.crackme0502
2nd line – MainActivity’s super class is android.app.Activity
3rd line – Source code name is MainActivity.java
The class consists of many fields and methods. In smali, a field is declared with the ".field" instruction. A static field has the following format:
.field <permission> static [decoration keyword] <field name>:<field type>
Annotations (comments) in smali begin with #; baksmali groups static fields under a "# static fields" comment and instance fields under "# instance fields". An instance field is declared without the static keyword:
# instance fields
.field <permission> [decoration keyword] <field name>:<field type>
For example:
# instance fields
.field private btnAnno:Landroid/widget/Button;
1st line – annotation by baksmali
2nd line – field btnAnno is android.widget.Button
If the class contains methods, there will be smali code for each one, beginning with the ".method" instruction.
There are direct methods and virtual methods.
# direct methods
.method <access permission> [ decoration key word] < prototype>
<.locals>
[.parameter]
[.prologue]
[.line]
<smali code>
.end method
“virtual methods” are similar to direct ones.
Interface begins with “.implements” instruction.
# interfaces
.implements < interface name>
An annotation begins with the ".annotation" instruction:
# annotations
.annotation [ property] < class name>
[ filed = value]
.end annotation
# instance fields
.field public sayWhat:Ljava/lang/String;
.annotation runtime Lcom/droider/anno/MyAnnoField;
info = "Hello my friend"
.end annotation
.end field
Tips:
With smali code being so difficult to write by hand, how can we produce it quickly?
The answer is Eclipse. Create an Android project in Eclipse, write the code you want in Java, and build the apk (which contains the dex). Then extract the smali code for your function with apktool and paste it wherever you need it.
For example, a Log.x call in Java can be decompiled into smali and pasted into the application's smali code to print debug messages:
invoke-static {v11, v12}, Landroid/util/Log;->e(Ljava/lang/String;Ljava/lang/String;)I
When debugging application programs, you can use one of the following two methods:
Add a log in smali code
Add more registers in the “.local” variable, e.g., v11,v12
Add log smali code in place
const-string v11, "@@@@"
const-string v12, "interceptPowerKeyDown enter"
invoke-static {v11, v12}, Landroid/util/Log;->e(Ljava/lang/String;Ljava/lang/String;)I
If the registers are v28 and v29, use the /range form instead:
invoke-static/range {v28 .. v29}, Landroid/util/Log;->e(Ljava/lang/String;Ljava/lang/String;)I
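Generating these log snippets can be automated. A hypothetical helper (not part of any of the tools listed above) that emits the smali lines shown earlier for a given tag and message:

```python
# Emit the smali lines that print a message via android.util.Log.e,
# using two free registers (v11/v12 by default, as in the example above).

def make_log_smali(tag, message, reg_a="v11", reg_b="v12"):
    return [
        f'const-string {reg_a}, "{tag}"',
        f'const-string {reg_b}, "{message}"',
        f"invoke-static {{{reg_a}, {reg_b}}}, "
        "Landroid/util/Log;->e(Ljava/lang/String;Ljava/lang/String;)I",
    ]
```

Pasting the three emitted lines into a method (after raising its register count) prints the message to logcat when that code path executes.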
Print call stack in smali:
Add a register in “.local” variable. e.g., v11
Add print call stack smali code in place
new-instance v11, Ljava/lang/Exception;
invoke-direct {v11}, Ljava/lang/Exception;-><init>()V
invoke-virtual {v11}, Ljava/lang/Exception;->printStackTrace()V
After editing your smali code, you may need to resolve run-time errors. To see smali run-time errors, you can use the following commands:
adb logcat | grep dalvikvm
adb logcat | grep VFY
The VFY output shows the failing smali file, the routine, and the cause of the error. The dalvikvm output shows call stacks, context, etc.
Typical runtime errors:
1. The variable list isn't consistent with the declaration.
2. The routine call type is incorrect. For example: use invoke-virtual for public/package routines and invoke-direct for private routines.
3. The apk is not signed properly.
4. Use adb logcat | grep mismatch to check which package signature is incorrect.
After changing the smali code, you need to repackage it by running "apktool b". Typically there will be some error messages, such as the following:
res/values/styles.xml:166: No resource found that matches the given name '@*android:style/Widget.Button'. You need to change '@*android:style/Widget.Button' in line 166 of styles.xml into '@android:style/Widget.Button'.
Many error lines like :\apktool\apk\res\values\public.xml:3847: error: Public symbol xxxxx The declaration is not defined.
All these errors are actually caused by the first error line:
res/values/strings.xml:242: error: Multiple substitutions specified in non-positional format. Did you mean to add the formatted="false" attribute?
Check line 242 in strings.xml to find the problem string and fix it.
Function calls (invoke-virtual and similar instructions) can only use registers v0~v15 as parameters; using v16 or higher causes errors. There are two ways to fix this:
Use the invoke-virtual/range {p1 .. p1} instruction
Add a move-object/from16 v0, v18 instruction to copy the value into a low register
Register pN is equivalent to register v(locals + N). For example, if ".local" declares 16 registers, you can use v0~v15 directly; p0 is equal to v16 and p1 is equal to v17.
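The pN-to-vN mapping described above is just an offset by the declared register count:

```python
# With ".locals N", parameter register pK aliases absolute register v(N + K).

def p_to_v(locals_count, p_index):
    """Map parameter register pK to its absolute register number vM."""
    return locals_count + p_index

# With 16 locals: p0 aliases v16 and p1 aliases v17, which is why only
# v0..v15 fit as direct parameters of non-range invoke instructions.
```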
Jump label conflict
You will get this error when two identical jump labels exist; for example, two cond_11 labels will make the build fail. Rename one of them (e.g., to ABCD_XXXX) to resolve the conflict.
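A quick way to catch such conflicts before rebuilding is to scan the smali file for labels defined more than once. A hypothetical helper, not part of apktool:

```python
# Find duplicated smali jump-label definitions (e.g., two :cond_11 lines)
# that would make "apktool b" fail.

def find_duplicate_labels(smali_lines):
    """Return the set of labels that are defined more than once."""
    seen, duplicates = set(), set()
    for line in smali_lines:
        stripped = line.strip()
        if stripped.startswith(":"):  # a label definition line
            if stripped in seen:
                duplicates.add(stripped)
            seen.add(stripped)
    return duplicates
```

References to a label (e.g., `goto :cond_11`) are ignored; only standalone definition lines count.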
Use no-definition variable
Only registers declared by the ".local" instruction can be used. For example, .local 30 means the routine can only use v0~v29; using v39 will cause an error.
Debug x86 native library of application
For example, let’s say you have an apk with the x86 native library libcmplayer_14.so. When the application runs on an Intel processor-based platform, a tombstone shows a crash in libcmplayer_14.so. The following paragraphs show how to narrow down potential problems by checking the platform API calls that libcmplayer_14.so makes.
Step 1: Use readelf to list the platform libraries that libcmplayer_14.so depends on. This gives you a sense of which component the issue may be related to.
readelf -d libcmplayer_14.so
Dynamic section at offset 0xd8b8 contains 33 entries:
Tag | Type | Name/Value |
---|---|---|
0x00000001 | (NEEDED) | Shared library: [libdl.so] |
0x00000001 | (NEEDED) | Shared library: [liblog.so] |
0x00000001 | (NEEDED) | Shared library: [libz.so] |
0x00000001 | (NEEDED) | Shared library: [libui.so] |
0x00000001 | (NEEDED) | Shared library: [libmedia.so] |
0x00000001 | (NEEDED) | Shared library: [libbinder.so] |
0x00000001 | (NEEDED) | Shared library: [libutils.so] |
0x00000001 | (NEEDED) | Shared library: [libstdc++.so] |
0x00000001 | (NEEDED) | Shared library: [libgui.so] |
0x00000001 | (NEEDED) | Shared library: [libandroid.so] |
0x00000001 | (NEEDED) | Shared library: [libsurfaceflinger_client.so] |
0x00000001 | (NEEDED) | Shared library: [libm.so] |
0x00000001 | (NEEDED) | Shared library: [libc.so] |
0x0000000e | (SONAME) | Library soname: [libcmplayer.so] |
0x00000010 | (SYMBOLIC) | 0x0 |
0x00000019 | (INIT_ARRAY) | 0xe89c |
0x0000001b | (INIT_ARRAYSZ) | 16 (bytes) |
0x0000001a | (FINI_ARRAY) | 0xe8ac |
0x0000001c | (FINI_ARRAYSZ) | 12 (bytes) |
0x00000004 | (HASH) | 0xd4 |
0x00000005 | (STRTAB) | 0x8f0 |
0x00000006 | (SYMTAB) | 0x350 |
0x0000000a | (STRSZ) | 2409 (bytes) |
0x0000000b | (SYMENT) | 16 (bytes) |
0x00000003 | (PLTGOT) | 0xe9f4 |
0x00000002 | (PLTRELSZ) | 496 (bytes) |
0x00000014 | (PLTREL) | REL |
0x00000017 | (JMPREL) | 0x12a4 |
0x00000011 | (REL) | 0x125c |
0x00000012 | (RELSZ) | 72 (bytes) |
0x00000013 | (RELENT) | 8 (bytes) |
0x6ffffffa | (RELCOUNT) | 6 |
0x00000000 | (NULL) | 0x0 |
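The (NEEDED) entries are the interesting part of this output. A small convenience sketch (not part of any official tool) to extract them programmatically:

```python
# Extract the NEEDED shared libraries from "readelf -d" output, to see at a
# glance which platform components the .so depends on.
import re

def needed_libraries(readelf_output):
    """Return shared library names from (NEEDED) lines of readelf -d."""
    return re.findall(r"\(NEEDED\).*?\[([^\]]+)\]", readelf_output)
```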
Step 2: Use the objdump tool to list the undefined (UND) symbols of libcmplayer_14.so. These are the API names that libcmplayer_14.so imports from the platform, any of which could be the problematic call.
objdump -T libcmplayer_14.so | grep UND
00000000 DF *UND* 00000000 _ZNK7android7RefBase9incStrongEPKv
00000000 DF *UND* 00000000 rewind
00000000 DF *UND* 00000000 pthread_attr_setschedparam
00000000 DF *UND* 00000000 fwrite
00000000 DO *UND* 00000000 __sF
00000000 DF *UND* 00000000 usleep
00000000 DF *UND* 00000000 memcpy
00000000 DF *UND* 00000000 realloc
00000000 DF *UND* 00000000 pthread_mutex_init
00000000 DF *UND* 00000000 pthread_attr_init
00000000 DF *UND* 00000000 strcat
00000000 w DF *UND* 00000000 __deregister_frame_info_bases
00000000 DF *UND* 00000000 _ZN7android7Surface4lockEPNS0_11SurfaceInfoEPNS_6RegionE
00000000 DF *UND* 00000000 __cxa_finalize
00000000 DF *UND* 00000000 sched_get_priority_max
00000000 DF *UND* 00000000 malloc
00000000 DF *UND* 00000000 dlsym
00000000 DF *UND* 00000000 strlen
00000000 DF *UND* 00000000 pthread_mutex_lock
00000000 DF *UND* 00000000 __cxa_atexit
00000000 DF *UND* 00000000 sched_get_priority_min
00000000 DF *UND* 00000000 snprintf
00000000 DF *UND* 00000000 __android_log_print
00000000 DF *UND* 00000000 dlerror
00000000 DF *UND* 00000000 setjmp
00000000 DF *UND* 00000000 pthread_mutex_destroy
00000000 DF *UND* 00000000 fclose
00000000 DF *UND* 00000000 fread
00000000 DF *UND* 00000000 fopen
00000000 DF *UND* 00000000 __stack_chk_fail
00000000 DF *UND* 00000000 time
00000000 DF *UND* 00000000 strtok
00000000 DF *UND* 00000000 pthread_create
00000000 DF *UND* 00000000 _ZNK7android7RefBase9decStrongEPKv
00000000 w DF *UND* 00000000 __register_frame_info_bases
00000000 DF *UND* 00000000 _ZN7android10AudioTrack5startEv
00000000 DF *UND* 00000000 pthread_cond_signal
00000000 DF *UND* 00000000 pthread_mutexattr_init
00000000 DF *UND* 00000000 sscanf
00000000 DF *UND* 00000000 pthread_cond_timedwait
00000000 DF *UND* 00000000 pthread_cond_init
00000000 DF *UND* 00000000 _ZN7android10AudioTrack5writeEPKvj
00000000 DF *UND* 00000000 pthread_attr_setschedpolicy
00000000 DF *UND* 00000000 memset
00000000 DF *UND* 00000000 sprintf
00000000 DF *UND* 00000000 fseek
00000000 DF *UND* 00000000 pthread_mutex_unlock
00000000 DF *UND* 00000000 pthread_cond_destroy
00000000 DF *UND* 00000000 strstr
00000000 DF *UND* 00000000 ftell
00000000 DF *UND* 00000000 free
00000000 DF *UND* 00000000 atoi
00000000 DF *UND* 00000000 strchr
00000000 DF *UND* 00000000 printf
00000000 DF *UND* 00000000 _ZN7android10AudioTrack5pauseEv
00000000 DF *UND* 00000000 pthread_cond_wait
00000000 DF *UND* 00000000 strdup
00000000 DF *UND* 00000000 puts
00000000 DF *UND* 00000000 dlopen
00000000 DF *UND* 00000000 _ZN7android10AudioTrack4stopEv
00000000 DF *UND* 00000000 _ZN7android10AudioTrack5flushEv
00000000 DF *UND* 00000000 strcpy
00000000 DF *UND* 00000000 pthread_join
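As with the readelf output, the *UND* lines can be extracted programmatically. A convenience sketch:

```python
# Pull the undefined (*UND*) symbol names out of "objdump -T" output; each is
# an external API the library resolves against the platform at load time.

def undefined_symbols(objdump_output):
    """Return the symbol name (last column) of every *UND* line."""
    symbols = []
    for line in objdump_output.splitlines():
        if "*UND*" in line:
            symbols.append(line.split()[-1])
    return symbols
```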
Step 3: Add logging (or use another debug method) in the platform libraries and APIs used by libcmplayer_14.so to find the crash position.
GDB check core dump
gdb.sh
Find the core dump file by using the command:
adb shell cat /logs/history_event
#V1.0 CURRENTUPTIME 0001:05:08
#EVENT ID DATE TYPE
REBOOT 8a41fa9bc91dc23cc5a2 1982-01-01/00:04:40 SWUPDATE 0000:00:00
STATE e4fd39f21a9630bf4187 1982-01-01/00:04:40 DECRYPTED
CRASH 4be805a6b721775813d6 2013-02-06/09:16:00 TOMBSTONE /mnt/sdcard/logs/crashlog0
CRASH 00526c3de09fd880c454 2013-02-06/09:16:16 APCOREDUMP /mnt/sdcard/logs/crashlog1
The history shows the core dump file is in /mnt/sdcard/logs/crashlog1.
Pull the core dump file and the application’s native x86 library to the host. For example:
adb pull /mnt/sdcard/logs/crashlog1
adb pull /data/app-lib/com.zeptolab.ctr.ads-1/libctr-jni.so
Copy the application’s native x86 library into the project symbol lib path. For example:
cp libctr-jni.so ~/JB/out/target/product/blackbay/symbols/system/lib
Use the attached gdb.sh script (you may need to adjust some paths in it for your host) to run gdb and analyze the core dump. For example:
cd ~/JB/
gdb.sh app_process out/target/product/blackbay
core ~/temp/crashlog1/1360114650_app_process_5145.core
Now you can use any of the gdb commands to analyze the core dump file. For example, use the “bt” command to get the backtrace.
objdump check native library
Same example as above:
Get the code base related to the bug
Build the code base for the bug
cd <aosp>/out/target/product/<platform>/symbols/system/lib. The symbol version of the library is here.
Run objdump -d libjni_latinime.so > tmp.log to disassemble the library.
Open tmp.log and search for the faulting eip (here 880b); the surrounding disassembly shows the routine in which the error occurred.
For example, you can use c++filt to turn _ZN8latinimeL30latinime_BinaryDictionary_openEP7_JNIEnvP8_jobjectP8_jstringxxiiii into a readable function name like the following:
c++filt _ZN8latinimeL30latinime_BinaryDictionary_openEP7_JNIEnvP8_jobjectP8_jstringxxiiii
latinime::latinime_BinaryDictionary_open(_JNIEnv*, _jobject*, _jstring*, long long, long long, int, int, int, int)
GDB debug assembly code
Target Device:
gdbserver :port --attach <PID>
Host PC:
adb forward tcp:1234 tcp:1234
gdb.sh app_process <symbol path>
target remote :1234
b *0x80123432 // break at address 0x80123432
x/64xw 0x80123432 // show 64 4-byte words of memory from address 0x80123432, in hex format
x/s 0x80123432 // display the string at address 0x80123432
info registers // show all register values
info symbol 0x80123432 // get symbol information for address 0x80123432
nexti // execute the next asm instruction
display $eax // automatically display the value of eax after each step
p expression // print the value of an expression
disassemble _ZN14ProfileManager18WriteTrophiesStateEP7__sFILERKN13PlayerProfile21ProfileActiveTrophiesE
// show the asm code of the function
Use the c++filt tool to demangle the function name:
_ZN14ProfileManager18WriteTrophiesStateEP7__sFILERKN13PlayerProfile21ProfileActiveTrophiesE
ProfileManager::WriteTrophiesState(__sFILE*, PlayerProfile::ProfileActiveTrophies const&)
Apk Install Directory
Since JB, paid apps downloaded from the Google Play store are installed to /data/app-asec in .asec format.
If you want to decompile such an apk, download and install it on an ICS platform instead.
Android version | Install path | File format on device
ICS | /data/app | .apk
JB | /data/app-asec | .asec
When installing paid applications in JB, you can try to find the installed package in the /mnt/asec directory.
For example, for Modern Combat 2:
/mnt/asec/com.gameloft.android.ANMP.GloftBPHM.ML-1/pkg.apk is the apk installation package.
Downloaded app data may be in the following locations (use find -name '*GloftBPHM*' to search):
/mnt/shell/emulated/0/Android/data/com.gameloft.android.ANMP.GloftBPHM.ML
/data/media/0/Android/data/com.gameloft.android.ANMP.GloftBPHM.ML
/storage/sdcard0/Android/data/com.gameloft.android.ANMP.GloftBPHM.ML (the data in /storage is most likely created by the application itself when launched)
x86 Debug Case
Antutu tombstone issue [BZ 107342]: run the 3D bench test after the DUT is encrypted and a tombstone occurs. The x86 emulator shows the same problem, so the issue most likely lies in the app itself.
Run cat /logs/his* to find the tombstone core dump file. Copy the Antutu libraries 3drating.5 and libabenchmark.so into out/target/product/redhookbay/symbols/system/lib.
Use the “GDB check core dump” method to open the core dump file. Use the bt command to show the call stack as follows:
#0 0x5e361ef5 in native_window_set_buffers_format (format=4, window=0x0) at system/core/include/system/window.h:749
#1 ANativeWindow_setBuffersGeometry (window=0x0, width=0, height=0, format=4) at frameworks/base/native/android/native_window.cpp:63
#2 0x60f4be20 in Ogre::AndroidEGLWindow::_createInternalResources(ANativeWindow*, AConfiguration*) () from /home/zwang/r4_2_stable/out/target/product/redhookbay/symbols/system/lib/3drating.5
#3 0x60f4c4dd in Ogre::AndroidEGLWindow::create(std::string const&, unsigned int, unsigned int, bool, std::map<std::string, std::string, std::less<std::string>, Ogre::STLAllocator<std::pair<std::string const, std::string>, Ogre::CategorisedAllocPolicy<(Ogre::MemoryCategory)0> > > const*) () from /home/zwang/r4_2_stable/out/target/product/redhookbay/symbols/system/lib/3drating.5
#4 0x60f4af8e in Ogre::AndroidEGLSupport::newWindow(std::string const&, unsigned int, unsigned int, bool, std::map<std::string, std::string, std::less<std::string>, Ogre::STLAllocator<std::pair<std::string const, std::string>, Ogre::CategorisedAllocPolicy<(Ogre::MemoryCategory)0> > > const*) () from /home/zwang/r4_2_stable/out/target/product/redhookbay/symbols/system/lib/3drating.5
#5 0x60f428eb in Ogre::GLES2RenderSystem::_createRenderWindow(std::string const&, unsigned int, unsigned int, bool, std::map<std::string, std::string, std::less<std::string>, Ogre::STLAllocator<std::pair<std::string const, std::string>, Ogre::CategorisedAllocPolicy<(Ogre::MemoryCategory)0> > > const*) () from /home/zwang/r4_2_stable/out/target/product/redhookbay/symbols/system/lib/3drating.5
#6 0x61095091 in Ogre::Root::createRenderWindow(std::string const&, unsigned int, unsigned int, bool, std::map<std::string, std::string, std::less<std::string>, Ogre::STLAllocator<std::pair<std::string const, std::string>, Ogre::CategorisedAllocPolicy<(Ogre::MemoryCategory)0> > > const*) () from /home/zwang/r4_2_stable/out/target/product/redhookbay/symbols/system/lib/3drating.5
#7 0x60ecaf05 in OgreAndroidBaseFramework::initRenderWindow(unsigned int, unsigned int, unsigned int) () from /home/zwang/r4_2_stable/out/target/product/redhookbay/symbols/system/lib/3drating.5
#8 0x60ec2b71 in ogre3d_initWindow () from /home/zwang/r4_2_stable/out/target/product/redhookbay/symbols/system/lib/3drating.5
#9 0x5f09e271 in Java_com_antutu_ABenchMark_Test3D_OgreActivity_initWindow () from /home/zwang/r4_2_stable/out/target/product/redhookbay/symbols/system/lib/libabenchmark.so
#10 0x40dce170 in dvmPlatformInvoke () at dalvik/vm/arch/x86/Call386ABI.S:128
#11 0x40e27a68 in dvmCallJNIMethod (args=0x57941df8, pResult=0x8000cf10, method=0x57c11e10, self=0x8000cf00) at dalvik/vm/Jni.cpp:1174
#12 0x40df197b in dvmCheckCallJNIMethod (args=0x57941df8, pResult=0x8000cf10, method=0x57c11e10, self=0x8000cf00) at dalvik/vm/CheckJni.cpp:145
#13 0x40e2da5d in dvmResolveNativeMethod (args=0x57941df8, pResult=0x8000cf10, method=0x57c11e10, self=0x8000cf00) at dalvik/vm/Native.cpp:135
#14 0x40f2ec8d in common_invokeMethodNoRange () from /home/zwang/r4_2_stable/out/target/product/redhookbay/symbols/system/lib/libdvm.so
#15 0x57941df8 in ?? ()
#16 0x40de1626 in dvmMterpStd (self=0x8000cf00) at dalvik/vm/mterp/Mterp.cpp:105
#17 0x40ddefc4 in dvmInterpret (self=0x8000cf00, method=0x579e0b68, pResult=0xbffff604) at dalvik/vm/interp/Interp.cpp:1954
#18 0x40e590ec in dvmInvokeMethod (obj=0x0, method=0x579e0b68, argList=0x4207c260, params=0x4207c170, returnType=0x417e42d0, noAccessCheck=false) at dalvik/vm/interp/Stack.cpp:737
#19 0x40e6cf67 in Dalvik_java_lang_reflect_Method_invokeNative (args=0x57941f00, pResult=0x8000cf10) at dalvik/vm/native/java_lang_reflect_Method.cpp:101
#20 0x40f2ec8d in common_invokeMethodNoRange () from /home/zwang/r4_2_stable/out/target/product/redhookbay/symbols/system/lib/libdvm.so
#21 0x57941f00 in ?? ()
#22 0x40de1626 in dvmMterpStd (self=0x8000cf00) at dalvik/vm/mterp/Mterp.cpp:105
#23 0x40ddefc4 in dvmInterpret (self=0x8000cf00, method=0x579d63c0, pResult=0xbffff8c8) at dalvik/vm/interp/Interp.cpp:1954
#24 0x40e57e1c in dvmCallMethodV (self=0x8000cf00, method=0x579d63c0, obj=0x0, fromJni=true, pResult=0xbffff8c8, args=<optimized out>) at dalvik/vm/interp/Stack.cpp:526
#25 0x40e1ba6e in CallStaticVoidMethodV (env=0x8000a020, jclazz=0x1d400015, methodID=0x579d63c0, args=0xbffff97c "\t") at dalvik/vm/Jni.cpp:2111
#26 0x40dfb440 in Check_CallStaticVoidMethodV (env=0x8000a020, clazz=0x1d400015, methodID=0x579d63c0, args=0xbffff97c "\t") at dalvik/vm/CheckJni.cpp:1679
#27 0x402685ba in _JNIEnv::CallStaticVoidMethod (this=0x8000a020, clazz=0x1d400015, methodID=0x579d63c0) at libnativehelper/include/nativehelper/jni.h:793
#28 0x40269e71 in android::AndroidRuntime::start (this=0xbffffa50, className=0x80001208 "com.android.internal.os.ZygoteInit", options=<optimized out>) at frameworks/base/core/jni/AndroidRuntime.cpp:1005
#29 0x80000fd0 in main (argc=4, argv=0xbffffaf8) at frameworks/base/cmds/app_process/app_main.cpp:190
Debugging from the framework APIs, we find:
android_view_Surface_getNativeWindow: gets surface = 0x802fbbc8 and returns a valid ANativeWindow*
ANativeWindow_setBuffersGeometry: receives the input parameter ANativeWindow* = NULL
So something happening between these two APIs results in the tombstone.
Antutu uses the OGRE render engine, as the call stack shows. The OGRE render engine source code is at this link:
http://code.metager.de/source/xref/ogre/RenderSystems/GLES/src/EGL/Android/OgreAndroidEGLWindow.cpp
The ANativeWindow* pointer value is corrupted in AndroidEGLWindow::create, shown below; this leaves mWindow NULL and causes the tombstone.
void AndroidEGLWindow::create(const String& name, uint width, uint height,
                              bool fullScreen, const NameValuePairList *miscParams)
{
    ...
    mWindow = (ANativeWindow*)(Ogre::StringConverter::parseInt(opt->second));
    ...
    _createInternalResources(mWindow, config);
    ...
}
On Intel processor-based platforms, pointer values such as ANativeWindow* can be higher than 0x80000000 (on ARM platforms they are lower than 0x80000000), so the unsigned int -> int -> string -> int round trip overflows a signed int. The problem does not occur on ARM.
In the Antutu case, the ANativeWindow* stored in opt->second should be treated as an unsigned int; because its value exceeds INT_MAX, parseInt fails and returns 0, producing the NULL window and the tombstone.
App Issues—Highlights
Application Usage Pre-condition
Error Symptom:
Many applications that appear to fail haven't really failed: a usage pre-condition was ignored by the tester or end user and later caused the failure. Some typical pre-conditions are as follows:
SIM card region differences (SIM cards from the U.S., China, and France have different 3G configurations)
Wi-Fi*/3G Internet connection differences. Some apps have specific locations or requirements for Wi-Fi/3G connection.
Poor Wi-Fi/3G connection environment. A poor connection makes some apps retry the Internet connection many times, which causes excessive power consumption and crashes.
Screen resolution/dpi constraints. Many applications require a particular screen resolution/dpi to work.
GMS services (Google Play services, Google Services Framework, etc.) turned off. Many GMS apps depend on these underlying services to work.
Solution:
Make sure you use apps that meet all of the above pre-conditions.
Fail to install app
Error Symptom:
Applications downloaded from the Play store fail to install on the device, or an apk fails to install via adb install.
Solution:
Make sure USB/SD card writes work. Make sure the Houdini hook in the PackageManagerService (PMS) works properly.
App has a hard-coded dependence on the ARM abi/arch property, etc.
Error Symptom:
Dalvik cannot resolve the link to the ARM library, or cannot find the native method implementation because the native library was not copied onto the device when the app ran.
Major app function check fail, etc.
Solution:
Remove the app's hard-coded check by changing its smali code, or ask the app ISV to fix it.
App depends on an OEM framework (e.g., Samsung modifies its framework)
Error Symptom:
Failure to find some field, method, or class.
Solution:
Change smali code to mask related field/method/class usage.
Copy related framework class into your device.
App depends on a native library that is missing on the Intel processor-based platform
Error Symptom:
UnsatisfiedLinkError naming the missing native library.
Solution:
Check the app's library dependencies and copy the related libraries onto the device.
App doesn’t have permission
Error Symptom:
no permission
Solution:
Add the appropriate <uses-permission /> element to AndroidManifest.xml.
Database structure difference
Error Symptom:
Missing field or type mismatch.
Solution:
Decompile the apk, check the smali code to find the SQL-related string (e.g., create table …), edit the SQL string to fit the database, and package it back into the apk.
App has dependence on ISV function package like <uses-library android:name="xxx_feature" /> in AndroidManifest.xml
Error Symptom:
No permission to access some feature when launching the app.
Solution:
Change AndroidManifest.xml to remove these <uses-library> tags, or copy the feature jar packages from another mobile platform onto your target device and try again.
com.google.android.vending.licensing.LicenseValidator.verify Issue related with paid app
If you see this issue, you can repro it using the following steps:
Error Symptom:
The issue log looks like the following:
E/AndroidRuntime(27088): FATAL EXCEPTION: background thread
E/AndroidRuntime(27088): java.lang.NullPointerException
E/AndroidRuntime(27088): at com.google.android.vending.licensing.LicenseValidator.verify(LicenseValidator.java:99)
E/AndroidRuntime(27088): at com.google.android.vending.licensing.LicenseChecker$ResultListener$2.run(LicenseChecker.java:228)
Repro Steps:
(a) Install apk
(b) Remove Google account on the device
(c) Launch this app
How to add resources into the apk file
Example: to add one string resource, add it to values/strings.xml:
<string name="newstring">content</string>
public.xml under the values directory records all resource ids. Find the last <public type="string" ...> element, then add <public type="string" name="newstring" id="0x7f0700a0" />
Change smali file to use the new string resource as follows:
invoke-virtual {p0}, Lcom/sini/SfsdfsActivity;->getResources()Landroid/content/res/Resources;
move-result-object v0
const v1, 0x7f0700a0
invoke-virtual {v0, v1}, Landroid/content/res/Resources;->getString(I)Ljava/lang/String;
Google Play Store Filter Details
Google filter link: http://developer.android.com/google/play/filters.html
Device DPI Filter
Google standard DPI values: 320 (XHDPI), 240 (HDPI), 160 (MDPI), and possibly 120 (LDPI)
If your device reports a different DPI, many applications in the Play store will be unavailable.
Device Feature Filter
Google Play Store will call the API PackageManager.getSystemAvailableFeatures and hasSystemFeature to get all the features on the device.
It will filter applications according to the features available on device.
Take the example of an MFLD PR phone running JB 4.1:
If camera-related features are not present in the return values of PackageManager.getSystemAvailableFeatures and hasSystemFeature, the MFLD PR phone won't see apps that need the camera, such as Smart Compass and WeChat, in the store.
If camera-related features are present in the return values of PackageManager.getSystemAvailableFeatures and hasSystemFeature, the MFLD PR phone can see camera apps such as Smart Compass and WeChat.
Notes:
You need to clear data and cache of Google Play Service and Google Play Store to make the feature change take effect on the target device.
Features available on the MFLD PR phone are as follows:
root@android:/ # pm list features
feature:reqGlEsVersion=0x20000
feature:android.hardware.bluetooth
feature:android.hardware.camera
feature:android.hardware.camera.autofocus
feature:android.hardware.camera.flash
feature:android.hardware.camera.front
feature:android.hardware.faketouch
feature:android.hardware.location
feature:android.hardware.location.gps
feature:android.hardware.location.network
feature:android.hardware.microphone
feature:android.hardware.nfc
feature:android.hardware.screen.landscape
feature:android.hardware.screen.portrait
feature:android.hardware.sensor.accelerometer
feature:android.hardware.sensor.barometer
feature:android.hardware.sensor.compass
feature:android.hardware.sensor.gyroscope
feature:android.hardware.sensor.light
feature:android.hardware.sensor.proximity
feature:android.hardware.telephony
feature:android.hardware.telephony.gsm
feature:android.hardware.touchscreen
feature:android.hardware.touchscreen.multitouch
feature:android.hardware.touchscreen.multitouch.distinct
feature:android.hardware.touchscreen.multitouch.jazzhand
feature:android.hardware.usb.accessory
feature:android.hardware.usb.host
feature:android.hardware.wifi
feature:android.software.live_wallpaper
feature:android.software.sip
feature:android.software.sip.voip