Is Using Kotlin Today a Risky Move?

Recently, a blog post entitled “The Case Against Kotlin” was published on Medium. The title could have been more descriptive, and I appreciate that it doesn’t tout Kotlin as rainbows and unicorns, but it did bring up a few points that I’d like to address.

The Learning Curve

Whenever I hear the learning curve cited as a consideration for adopting a new technology, I approach the argument with a bit of skepticism. As technologists, we have to continue to learn and evolve in order to stay relevant. It’s the nature of the field, which is still innovating and changing at a rapid pace. It can, indeed, take time for teams to come up to speed on a new technology, but ultimately that investment is critical for most people who want a long, fulfilling tech career. Having spent some time in stodgy enterprise Java corporate environments, I recognize that the learning curve is often touted as a reason for not adopting new tools and languages.

In one egregious example, I recall an enterprise architect recommending that a team use GWT instead of a more modern JavaScript framework for a web project. The main reason cited was not that GWT offered better support for the product or a better way to implement product features; it was that they felt it was unrealistic to have the team learn a new toolset and language. This type of culture is unfortunately all too common in the enterprise Java community. The Qt framework takes this thinking even further: with Qt, you can use C++ to create applications for iOS and Android, but you end up writing more lines of code. Maybe that feels safer to a C++ developer, but the reality is that most developers will need to learn a number of different programming languages throughout their careers.

Even if they return to a language they’ve used before, it offers them new perspectives on how they write code. For example, I spent a number of years in enterprise Java before taking a detour into Ruby. When I started doing Android development, I wrote Java very differently from the way I did before I switched to Ruby. Then, once I discovered Kotlin, it was a no-brainer to change languages again, because I love Android but hate Java! Eventually, after doing some projects in Swift and TypeScript, the Kotlin experience made those languages easier to learn. To be fair, if we were talking about learning something like Haskell, the learning curve might be much steeper, but in this instance, we’re talking about a language that was designed with Java developers in mind.

Build Time

Build time is a real problem with the current toolset. In the short term, you can throw hardware at the problem (if you haven’t already); in the long term, as Ryan stated, it will be optimized. It’s simply part of using a newer language.

Development Stability

Again, this is something you always have issues with when you adopt a newer tool. For that matter, from time to time I see stability issues even with established tools. If you’re new to the language, some issues may not be readily apparent and could send you down a few rabbit holes. As more developers use the tool, more articles will be published, and the tools will improve.

Static Analysis

While there is some improvement to be made here in Android Studio, it does highlight a larger issue with tools, and Java, in general. After a number of years of enterprise Java development, I found myself getting very dependent on development tools, and many of my co-workers felt the same way. While the tools are great, I’ve seen them make mistakes, even with Java. When I switched to development in a dynamic language, I was forced to break away from that dependency, because some of the static analysis that is possible in a language like Java is simply not possible in Ruby and JavaScript. I’ve also seen Android Studio do some strange things when it automatically makes changes based on static analysis of Java code.

Better analysis is definitely possible in Kotlin, but even without it, you end up writing fewer lines of code. Fewer lines of code means fewer lines of code to review. Do automated tools make things easier on a big team? Absolutely! But if humans have less code to parse, the tooling gap probably doesn’t make an enormous difference.
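To make the point concrete, here is a sketch (the class and property names are made up for illustration) of how much Java a single Kotlin declaration can replace:

// Hypothetical example: this one-line data class gives you the
// constructor, getters, equals(), hashCode(), toString(), and copy()
// that an equivalent Java value class would spell out by hand,
// often several dozen lines that a reviewer would otherwise have to read.
data class User(val name: String, val email: String)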

Reversibility

Kotlin is one of the few languages I’ve seen with IDE integration that will automatically convert another language to it in a fairly seamless manner. If you’ve ever done any iOS development, you’ll know that there is no automatic code converter built into Xcode that will convert your Objective-C code to Swift. If you convert some code and don’t modify it, you can always use source control to get the original version back. That said, it’s best to understand the limitations and know what you’re getting into.

Caveat Emptor

Why are you using the tool?

I could spend some time advocating for newer tools, but you first need to understand why you are using them! I’ve seen teams cargo-cult new tools and technologies without understanding why they are using them. Enterprise Java earned this reputation, which is how FizzBuzzEnterpriseEdition came to be. Though this project is, of course, a joke, anybody who’s worked on enterprise software has seen the real thing. Some parts of Android were arguably influenced by the enterprise mindset: the long-time Google-recommended practice of using content providers for local database storage is one of those concepts that they are now moving away from.

Kotlin is going to be with us for a while

Although Kotlin doesn’t have as much traction as some other languages, the signs all point to it gaining popularity in the mobile space. Whether it gains popularity on the server side or in other domains is an open question, but Android development is driving its adoption. Why? First, it has first-class support in Android Studio, thanks to JetBrains. Second, with its new features and reduced code footprint, it gives you long-term productivity gains (each line of code is a liability to a project). Third, even before it was officially endorsed by Google, it was really the only reliable alternative to Java for native development. Finally, it is now officially supported by Google.

Having done native iOS development in Swift as well, I’d say my biggest complaint about Kotlin is that my Android programs are still larger than their iOS Swift counterparts. Some of that is due to the design of the Android framework, but it’s also because the language is not as big a step forward as Swift was from Objective-C.

Conservative development is detrimental in native development

Are there times when it makes sense to be conservative with your platform? Absolutely! But mobile is changing rapidly. Each year, a new version of Android is released. If your apps are being run by consumers, that means that each year more users will be upgrading to new versions of the OS, with new features that need to be implemented. Tools are coming out that make us more productive, and we are refining how we organize our applications.

Blaze the trail, but be ready for a few bumps

Before you use any new tool, understand why you are using it. For most new applications, I tend to use an MVVM architecture with data binding, RxJava, etc. While there might be a good reason to go with MVP, most projects that start that way today have more to do with developers not wanting to learn the tools around MVVM than with MVP actually being the better choice.

Sometimes an older architecture/technology/tool/language is more appropriate for technical reasons. For example, when people need to write highly optimized native code on Android, they use C++, which makes sense. But most applications don’t need that kind of optimization, in the same way that most Android applications really don’t need extra Java-based optimizations. Things will certainly be a bit rougher around the edges with a newer technology. At times it might even be a bit of a white-knuckle experience!

As soon as you write a line of code, it becomes legacy code. The older the tools you use to create your code today, the older the code will look when someone has to maintain it in six months. If a technology is seriously unstable, or there is some critical edge case it doesn’t cover today, then wait. But if there are workarounds, these growing pains are investments in the future. When Swift first came out, a lot of Objective-C developers turned their noses up at the new language; now Swift is far more popular than Objective-C. The same thing happened with Java and C++.

In conclusion: are there risks associated with blazing a trail with Kotlin? Of course! But what happens if you decide to arrive late to the party? As an organization, it means you’ll be playing catch-up with your team, and as an individual developer, it means you’ll be playing catch-up with your skillset. Legacy technology can be just as much of a liability, and Kotlin, compared to other technology changes I have seen, is a pretty safe bet with a much lower barrier to entry.


Android Things with Raspberry Pi: Workshop Basics Part 5 (Small Displays with the ssd1306)

In our last post we looked at using the temperature sensor, but what if we want to display that data? We could display it on an attached high-resolution screen, but what if we are using a SOM that doesn’t have display hardware built in? Also, what if we only need a small display and want a low-cost option? You could use a TM1637 segment display if you only have a small amount of data to show, but what if you want more versatility? That is where the SSD1306 comes in.

The SSD1306

Most SSD1306 boards come with OLED displays that are either 128×64 or 128×32 dots in resolution. The displays are designed for monochromatic operation, which means that each dot can either be on or off. You can adjust the brightness of the entire display, but not individual pixels. Some displays give you a multicolor effect by having one section of the display set to one color, such as yellow, and the other set to another, such as blue. Depending on the display and vendor, you can purchase these for as little as $3.

Getting started

The Android Things drivers-samples repository has good information for getting started (follow the link). While these examples can get you started, they don’t give you a lot of information on how to display text. If you’ve worked with raw canvas graphics before, it will be easy to translate that knowledge. Many Android developers never need to do that, so for those who haven’t, here is a quick rundown. First, for this display you are going to need a bitmap to paint into. Here is an example that creates one for a 128×64 display:

val conf = Bitmap.Config.ARGB_8888 // the config was not defined in the original snippet
val mBitmap = Bitmap.createBitmap(128, 64, conf)

Next, you are going to need to create a Canvas object backed by that bitmap:

val canvas = Canvas(mBitmap)

Now, create a paint object specifying the color (white), style (fill), and a font size:

val fontSize = 16f // example value; pick a size that fits your display
val paint = TextPaint()
paint.color = Color.WHITE
paint.style = Paint.Style.FILL
paint.textSize = fontSize

Use a DynamicLayout to paint the text onto the canvas. We are letting it take care of the text centering, but it does allow for position adjustments (see the documentation):

// "text" is the String you want to display
val dynamicLayout = DynamicLayout(text, paint, 128,
        Layout.Alignment.ALIGN_CENTER, 1f, 0f, false)
dynamicLayout.draw(canvas)

Finally, we paint this onto our screen:

BitmapHelper.setBmpData(mScreen, 0, 0, mBitmap, false)
mScreen.show()
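Putting the steps together, here is a minimal sketch of a helper function. The function name and default font size are our own, and it assumes an already-initialized Ssd1306 instance from the contrib driver:

import android.graphics.Bitmap
import android.graphics.Canvas
import android.graphics.Color
import android.graphics.Paint
import android.text.DynamicLayout
import android.text.Layout
import android.text.TextPaint
import com.google.android.things.contrib.driver.ssd1306.BitmapHelper
import com.google.android.things.contrib.driver.ssd1306.Ssd1306

// Sketch: render text centered on a 128×64 ssd1306 display.
fun drawText(screen: Ssd1306, text: String, fontSize: Float = 16f) {
    val bitmap = Bitmap.createBitmap(128, 64, Bitmap.Config.ARGB_8888)
    val canvas = Canvas(bitmap)
    val paint = TextPaint()
    paint.color = Color.WHITE
    paint.style = Paint.Style.FILL
    paint.textSize = fontSize
    DynamicLayout(text, paint, 128,
            Layout.Alignment.ALIGN_CENTER, 1f, 0f, false).draw(canvas)
    BitmapHelper.setBmpData(screen, 0, 0, bitmap, false)
    screen.show() // push the buffer to the display
}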

To see a full example that prints out text and graphics, head over to our example on GitHub here. These examples have been designed for the 128×64 displays, which tend to be more common. When you run it, you should see text and graphics cycling on the display.

We also designed the examples to work with the two-color displays; for example, a display with 48 rows of blue pixels and 16 rows of yellow.


The contrast feature was not initially in the driver, so we added it. At the time of this writing, it is still under Google code review. If you would like to use it before it is pulled into the main driver, clone this repo into the directory that hosts this example (the directory should contain both contrib-drivers and ssd1306_example), build the driver, and reference the .aar file (see the comments in the project). The contrast can be set any time after you have initialized the display. Valid values range from 0 (lowest contrast) to 255 (highest):

mScreen.setContrast(255)

Stay tuned for our next post as we explore more of the Android Things Platform.


Android Things with Raspberry Pi: Workshop Basics Part 4 (Measuring Temperature and Pressure)

Continuing our Android Things series, let’s learn how to use a sensor. If you look at the driver samples, you’ll see samples for the BMX280. The BMX280 is a temperature/pressure sensor manufactured by Bosch. It comes in two models: the BME280, which can measure temperature, pressure, and relative humidity, and the BMP280, which only measures temperature and pressure. The current Android Things kits use the BMP280. If you look at the samples, they reference two sensor boards that contain this sensor. One is the Rainbow HAT for Android Things, which costs about $25, and the other is the Adafruit BMP280 I2C or SPI sensor for $10.

Less expensive options

While these boards work well, they are pretty expensive. To be fair, the Adafruit sensor has great documentation and a voltage regulator that allows it to work with either 5 V or 3.3 V circuits, and its sensors are guaranteed first-run parts. With a Pi, we only need to support 3.3 volts, which is the operating voltage of the bare sensor. Luckily, there are less expensive options available on Amazon for ~$6 and eBay for ~$1. Warning: the cheap $1 boards tend to have a high DOA rate and may not be accurate. The quality of any sensor under $4 is somewhat suspect, because they’re priced below the actual cost of the components. The hookup and use are a bit different from the documentation provided with the samples, so let’s learn how to use these generic boards.

Hooking it up

[Diagram: BMP280 breadboard wiring]

Like the Adafruit board, this one has 4 wires. The main difference is that we connect it to a 3.3 V lead instead of a 5 V one. The diagram shows the correct wiring. Some boards do not have the address select hooked up. If that’s the case, you’ll need to connect the SDO-labeled pin to ground (for address 0x76) or 3.3 V (for address 0x77).

I2C addresses

I2C allows multiple devices to share one set of wires. Each device has an address that needs to be unique, and based on that address you can send commands to and read values from the various devices. For more information on I2C, check out this article from SparkFun. If you read the data sheet, you’ll see that the BMP280 has a pin that is either pulled down to ground or up to the line voltage to assign an address of 0x76 or 0x77, respectively.

The Android Things contrib-drivers default to address 0x77, which happens to be the default address for the Adafruit sensor and the Rainbow HAT. The default address for the generic board is 0x76.

Modifying the samples to use 0x76

The samples do not show you how to specify the address, so here’s what you need to do.

First, you need to make sure that you have version 0.3 or higher of the BMX280 driver. It is specified in the BMX280 sample’s Gradle file:


dependencies {
    compile 'com.google.android.things.contrib:driver-bmx280:0.3'
    provided 'com.google.android.things:androidthings:0.4-devpreview'
}

Next, in the onCreate method, you need to replace the single-parameter call to the Bmx280SensorDriver constructor with the two-parameter version that specifies the address of the sensor:


try {
    mTemperatureSensorDriver = new Bmx280SensorDriver(BoardDefaults.getI2CPort(), 0x76);
    mTemperatureSensorDriver.registerTemperatureSensor();
} catch (IOException e) {
    Log.e(TAG, "Error configuring sensor", e);
}

Now, provided that everything is hooked up correctly, the app should run and you should see something like this in your logcat:


07-20 04:53:28.315 12334-12334/com.example.androidthings.driversamples I/TemperatureActivity: sensor changed: 29.394226
07-20 04:53:28.358 12334-12334/com.example.androidthings.driversamples I/TemperatureActivity: sensor changed: 29.389133
07-20 04:53:28.526 12334-12334/com.example.androidthings.driversamples I/TemperatureActivity: sensor changed: 29.378952
07-20 04:53:28.574 12334-12334/com.example.androidthings.driversamples I/TemperatureActivity: sensor changed: 29.373861
07-20 04:53:28.661 12334-12334/com.example.androidthings.driversamples I/TemperatureActivity: sensor changed: 29.378952
07-20 04:53:28.704 12334-12334/com.example.androidthings.driversamples I/TemperatureActivity: sensor changed: 29.368769
07-20 04:53:28.748 12334-12334/com.example.androidthings.driversamples I/TemperatureActivity: sensor changed: 29.35349
07-20 04:53:28.878 12334-12334/com.example.androidthings.driversamples I/TemperatureActivity: sensor changed: 29.358585
07-20 04:53:28.921 12334-12334/com.example.androidthings.driversamples I/TemperatureActivity: sensor changed: 29.35349
07-20 04:53:29.046 12334-12334/com.example.androidthings.driversamples I/TemperatureActivity: sensor changed: 29.343311
07-20 04:53:29.051 12334-12334/com.example.androidthings.driversamples I/TemperatureActivity: sensor changed: 29.3484
07-20 04:53:29.094 12334-12334/com.example.androidthings.driversamples I/TemperatureActivity: sensor changed: 29.343311

Pressure Readings

If you want to take pressure readings, you will also need to have temperature readings and their callbacks enabled for pressure to work. This is a bug in the current drivers.
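The workaround, as a sketch in Kotlin (it assumes the driver’s registerPressureSensor() call alongside the temperature registration shown above), looks roughly like this:

// Register the temperature sensor alongside the pressure sensor so
// that the pressure callbacks actually fire.
try {
    val driver = Bmx280SensorDriver(BoardDefaults.getI2CPort(), 0x76)
    driver.registerTemperatureSensor() // required for pressure updates to arrive
    driver.registerPressureSensor()
} catch (e: IOException) {
    Log.e(TAG, "Error configuring sensor", e)
}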

Stay tuned for our next post as we continue to delve deeper into Android Things!


Android Things with Raspberry Pi: Workshop Basics Part 3 (Common Issues)

In our previous posts, we learned the basics and added a little bling to Hello World. However, when you’re working with hardware, you can run into problems you might not expect when working on software alone. Here are some things we’ve learned along the way, as well as some steps to follow for figuring out what’s wrong.

I have wires that are getting REALLY warm!

While it might help to heat up the room in the wintertime, overheating wires are, generally speaking, not a good thing; it means that you have a short circuit somewhere. Disconnect the power and all peripheral connections immediately! A short circuit can damage a computer or monitor (depending on where it is), and it can also damage the Pi. During our very first workshop, we didn’t cover breadboard basics, which led to some students hooking up circuits in the wrong orientation. Amazingly, it didn’t damage any of the Pis, but it did cause some issues that needed to be fixed.

I’m not able to connect to my Pi via the TTY cable

After the students shorted out their units, this was the first symptom they experienced.

  1. If it happens to you, the first thing you should do is disconnect any peripherals, hook up a monitor via the HDMI port, and power up the unit. If the screen boots into Android Things and shows your default activity or a screen with an IP address, there is likely a problem with your TTY cable or terminal settings.
  2. If the unit doesn’t boot up, there are two other common problems. One could be a bad power supply. If you have another USB power supply, you can try that. If you don’t, another way to determine if the supply is bad is to try to connect your current power supply up to another USB-based device.
  3. If both of the above are fine, the short may have caused corruption on the file system. Re-flashing the Android Things image will solve that problem.

When this mishap took place, one student ended up with a non-functional power supply that needed to be swapped out, and also had to re-flash the image. The other students only needed to re-flash and were back up and running.

Other good practices

  • Never hook things up while the Pi has power. You can accidentally short-circuit something which can damage your power supply, Pi, or even your computer.
  • Make sure that your circuits are hooked up correctly, and verify they are correct before powering them on. If not, you can end up applying voltage to a component that is not designed to have that voltage applied, or cause a short circuit that can destroy things.
  • If you’re using a breadboard, make sure that you’re using it in the correct orientation so that it won’t cause shorts. (Also, if you’re hosting a workshop, don’t forget to explain how breadboards work to people who may be new to electronics!)

Stay tuned for our next post!


Android Things with Raspberry Pi: Workshop Basics Part 2 (The Internet of Bling with the APA 102)

In the previous post, we talked about the basics of getting Android Things running on a Pi and connecting to it. Now that it’s working, let’s do the traditional “Hello World” with some lights. We could use a standard LED, but that’s pretty plain vanilla, so instead we’re going to use APA102 LEDs. Your first question might be: “Why the APA102?”

  1. Each LED actually consists of three individual LEDs, red, green, and blue, which can be used to create a large number of colors.
  2. They are controlled via the SPI protocol, which allows you to use 4 wires to control hundreds of LEDs.
  3. At the time of writing, it is the only RGB LED of this type supported by Android Things.

Electricity Basics

LEDs, especially in multiples, can easily consume a fair amount of electricity, so now might be a good time to review some basics. USB ports run on 5 V DC. If you have multiple USB-based chargers, however, you might notice that different ones carry different wattage/amp ratings. For the sake of this discussion, we’re going to talk in terms of Watts, which can always be calculated from an Amps value (by multiplying voltage by amps). Most pre-USB-C smartphones come with chargers rated at 1-1.5 Amps; tablets such as the iPad are equipped with chargers rated at 2.4 Amps.

The Raspberry Pi 3 is a fairly power-hungry device: the manufacturer recommends a 2.5 Amp USB power supply (12.5 Watts). Approximately 1.5 Amps is budgeted to run the hardware on the board at full load, and the rest is for powering peripherals. A USB keyboard or microphone, for example, draws from that budget. Under-powering the device can cause your Pi to behave strangely. The Raspberry Pi has a 5V pin that is hooked up directly to the power supply.

Hooking Up a Small Strip

The APA102s run off of 5 volts, and each APA102 LED consumes 0.3 Watts of power. So, if you have a 2.5 Amp power supply with no peripherals plugged in, you could theoretically run up to 16 of them directly off the board (the arithmetic is worked out below). However, given that many power supplies don’t fully deliver the power they advertise, and because you may want to plug other peripherals into your board, we recommend maxing out at no more than 7 to be safe.
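Working through that arithmetic:

2.5 A × 5 V = 12.5 W total from the supply
1.5 A × 5 V = 7.5 W reserved for the board itself
(12.5 W − 7.5 W) ÷ 0.3 W per LED ≈ 16 LEDs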

The easiest form factor to work with is a strip, which can be purchased at places like Amazon. If you want to avoid doing any soldering, you can either cut a small strip off the end with a connector or use male hookup wires to pierce the small holes between the LEDs (assuming you bought a strip that has these small holes).

Caution: Connecting the leads while the unit is powered up could lead to an accidental short, which could damage the power supply and the SD card of the unit itself. Always connect wires while the unit is powered down!
[Diagram: APA102 wiring for Android Things]
Use the diagram above to hook up your LEDs. One thing to understand is that the signal wires are hooked up in series and can only pass signals in one direction. To that end, you will see terminals labelled DI, CI, DO and CO. In these abbreviations, the I stands for input and the O stands for output. Thus, if you had a small strip of lights and hooked your wires up in the middle of the strip, half of the lights would receive signals and work, while the other half would not. You therefore want to make sure that your control wires are always hooked up to the connectors labelled with an I.

Running Your First “Internet Of Bling” App

An Android Things app is, at its core, the same thing as an Android application. The main differences are that some services were removed, support is offered for accessing device pins, and you may or may not have a display. You can learn all about those differences here. The APA102 driver is part of the contrib-drivers open source repo. There is also an examples repo that contains examples for using these drivers. Of note: at the time of this writing, the Pi diagram in the APA102 examples repo shows the connections going to the wrong pins on the strip. (We have submitted a pull request to fix this.)

Once you’ve powered up the device and connected (via “adb connect <ip-address>”), run the APA102 module from Android Studio and deploy it to your connected Pi. If everything is connected correctly and your app compiles and deploys, the strip should light up in the demo pattern.

If the lights don’t light up, but everything looks good in your Android Studio Logcat, verify that you have connected things to the correct pins and that you don’t have a bad wire.

7 (or 16) LEDs are great, but I want MORE LEDs!

As you might recall from our earlier discussion, it is possible to control a lot more than 7 APA102 LEDs from an Android Things device. There are a few things you need to know before attempting this. In our first exercise, we ran 5 volts to the strip; however, the SPI lines, like all of the other I/O pins on a Pi, switch at 3.3 volts. This means that the signals we were sending in our first circuit did not have the full amplitude the strip was designed to receive. For a small strip of lights this works, but as the strip gets longer and we start using an external power supply, we should convert the 3.3 volt signals to 5 volt ones. Beyond giving each LED cleaner data, this also allows for longer wire runs.

To do that, we are going to add a voltage leveler (a logic level shifter) to the control lines. We are also adding an external power supply for the LEDs that can handle the current, along with connectors. The capacitor reduces voltage spikes when you first plug in the external power supply; without it, a sudden spike could burn out the first LED in the strip.

Things you will need:

[Diagram: rpi3_schematics_bling_bb wiring]

Once you’ve gathered all of these parts, hook up the circuit as per the diagram above. Note: you will need to make sure that the notch on the voltage leveler is oriented correctly to get the correct hookups. In the diagram, it is on the left side of the chip.

Modifying the Code

Once you have connected everything up, you will need to modify the NUM_LEDS constant in the MainActivity.java file of the APA102 module to reflect the number of LEDs in your strip:


private static final int NUM_LEDS = 144;

Now run the module and your lights should come to life!

Stay tuned for the next installment where we will discuss a few common problems.


Android Things with Raspberry Pi: Workshop Basics Part I (Connecting to the Network)


Around the beginning of 2017, Google introduced an IoT platform called Android Things, which makes it easy for Android developers to develop IoT applications. Given Polyglot Programming’s interest in both IoT and Android, we were very early adopters of this platform, back when it was called Brillo. As part of our work with this platform, we’ve been involved in hosting workshops through the Atlanta Google Developer Group to help get people up and running with it. The documentation from Google is pretty good for experienced developers, but along the way we’ve discovered some things to help you as a beginner, or as a GDG organizer wanting to host your own workshop. This post assumes that you already have a basic understanding of the platform, have flashed the image to your SD card, and that you have some knowledge about hardware in general.

Why use a Pi?

At the time of this writing, there are six hardware platforms supported by Android Things: two from Intel, three from NXP, and one from the Raspberry Pi Foundation. Intel has recently discontinued their IoT platform, and the boards you can still find for sale are pretty expensive, roughly $80. The NXP boards are in the same price range. The Pi 3, by comparison, costs about $40, which makes it a more accessible choice for people on a budget.

Understanding the Pi

First off, the only officially supported Pi is the Raspberry Pi 3. We’ve heard unconfirmed reports of it working on a Pi 2, but can conclusively confirm that it will not even boot on a Pi Zero! If you look at the diagram of a Pi 3, you’ll notice that the USB power port is for power only. While USB can be used for bi-directional data transfer, at this time the USB ports on an Android Things Pi cannot be used for ADB sessions. Therefore, you will need to get your device connected to a network via an Ethernet cable or Wi-Fi.

For a network connection to work, you need to be on a network where peer-to-peer connections are allowed. Most home networks and mobile hotspots allow this, but some public ones may not, for security reasons. If you have access to a physical Ethernet port, the easiest way to get going is to hook your device up to an HDMI display. After the device boots (with no installed apps), it will display its current IP address at the bottom of the screen. From there, you can follow the instructions here to connect your device to your Wi-Fi network so that you don’t have to maintain a constant physical network connection. But what happens if you don’t have access to a physical Ethernet port or an HDMI display (or it would be a pain to work from them)?

USB to TTY cable/Serial debug to the rescue (sort of)

[Diagram: Raspberry Pi serial console hookup]

Luckily, you can use a USB to TTY serial cable to get to a serial debug console, which is the same thing you’d get if you were to type in

./adb shell

Note: If you plan on using the physical serial interface for your own devices, you’ll need to disable this debug console. In that scenario, you’ll either need to revert to a physical Ethernet connection to change settings or, alternatively, create another mechanism for changing the network, such as your own BLE-based configuration option.

To get started, hook the cable up as per the diagram above. Once you have the cable hooked up, you’ll need to use a terminal program to access the serial connection. If you’re on macOS, ZTerm is a good free option; Serial is another. If you’re using the Adafruit cable on a Mac, the port will be called SLAB_USBtoUART. Go to the official Android Things documentation to get the rest of the settings. Once you have connected with the correct settings, you should see a prompt that looks something like this (you may need to hit Enter once to get it):


rpi3:/ $

If you haven’t played around with the shell before but are familiar with Linux, it might be a good time to spelunk around the filesystem to see what is going on. Android, at its heart, is based on Linux.

Checking the status of your network connection

Android, like most Unix-based systems, supports the ifconfig command. Typing this on an Android Things device connected over wired Ethernet will produce something like this:

wlan0 Link encap:Ethernet HWaddr b8:27:eb:54:86:07
UP BROADCAST MULTICAST MTU:1500 Metric:1
RX packets:11712 errors:0 dropped:11712 overruns:0 frame:0
TX packets:0 errors:0 dropped:0 overruns:0 carrier:0
collisions:0 txqueuelen:1000
RX bytes:5421496 TX bytes:0 

lo Link encap:Local Loopback
inet addr:127.0.0.1 Mask:255.0.0.0
inet6 addr: ::1/128 Scope: Host
UP LOOPBACK RUNNING MTU:65536 Metric:1
RX packets:354 errors:0 dropped:0 overruns:0 frame:0
TX packets:354 errors:0 dropped:0 overruns:0 carrier:0
collisions:0 txqueuelen:1
RX bytes:175044 TX bytes:175044

eth0 Link encap:Ethernet HWaddr b8:27:eb:01:d3:52
inet addr:192.168.0.96 Bcast:192.168.0.255 Mask:255.255.255.0
inet6 addr: fe80::ba27:ebff:fe01:d352/64 Scope: Link
inet6 addr: 2602:302:d15c:5fc8:ba27:ebff:fe01:d352/64 Scope: Global
inet6 addr: 2602:302:d15c:5fc8:44ec:a445:3f14:9d22/64 Scope: Global
UP BROADCAST RUNNING MULTICAST MTU:1500 Metric:1
RX packets:55690 errors:0 dropped:118 overruns:0 frame:0
TX packets:10105 errors:0 dropped:0 overruns:0 carrier:0
collisions:0 txqueuelen:1000
RX bytes:10610970 TX bytes:2274942

The lines that we’ll be focusing on are the wlan0 entry (wireless adapter) and eth0 (wired adapter). In the example above, our physical Ethernet adapter is connected to the network with an IP address of 192.168.0.96.

Connecting to a Wi-Fi network

The official documentation provides some instructions on connecting to a wireless network here. Under Step 1, there’s an important detail that they left out about the password: you will need to add an extra space after the passphrase. If you don’t, the command loses the last character, and your attempts to connect to the network will keep failing! You can use the instructions provided by Google to verify your connection.
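At the time of writing, the connect command from the documentation took roughly this form (the SSID and passphrase here are placeholders, and note the extra trailing space after the passphrase):

am startservice \
    -n com.google.wifisetup/.WifiSetupService \
    -a WifiSetupService.Connect \
    -e ssid <Network_SSID> \
    -e passphrase <Network_Passcode>

Another trick is to use the ifconfig command from the shell/serial debug session: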


wlan0 Link encap:Ethernet HWaddr b8:27:eb:54:86:07
inet addr:192.168.0.83 Bcast:192.168.0.255 Mask:255.255.255.0
inet6 addr: 2602:302:d15c:5fc8:5e7:fe2b:89c0:b51c/64 Scope: Global
inet6 addr: fe80::ba27:ebff:fe54:8607/64 Scope: Link
inet6 addr: 2602:302:d15c:5fc8:ba27:ebff:fe54:8607/64 Scope: Global
UP BROADCAST RUNNING MULTICAST MTU:1500 Metric:1
RX packets:794 errors:0 dropped:646 overruns:0 frame:0
TX packets:253 errors:0 dropped:0 overruns:0 carrier:0
collisions:0 txqueuelen:1000
RX bytes:287291 TX bytes:43241 

lo Link encap:Local Loopback
inet addr:127.0.0.1 Mask:255.0.0.0
inet6 addr: ::1/128 Scope: Host
UP LOOPBACK RUNNING MTU:65536 Metric:1
RX packets:29 errors:0 dropped:0 overruns:0 frame:0
TX packets:29 errors:0 dropped:0 overruns:0 carrier:0
collisions:0 txqueuelen:1
RX bytes:3459 TX bytes:3459

eth0 Link encap:Ethernet HWaddr b8:27:eb:01:d3:52
inet addr:192.168.0.96 Bcast:192.168.0.255 Mask:255.255.255.0
inet6 addr: fe80::ba27:ebff:fe01:d352/64 Scope: Link
inet6 addr: 2602:302:d15c:5fc8:a91f:194b:baaf:6cea/64 Scope: Global
inet6 addr: 2602:302:d15c:5fc8:ba27:ebff:fe01:d352/64 Scope: Global
UP BROADCAST RUNNING MULTICAST MTU:1500 Metric:1
RX packets:12306 errors:0 dropped:1 overruns:0 frame:0
TX packets:10998 errors:0 dropped:0 overruns:0 carrier:0
collisions:0 txqueuelen:1000
RX bytes:1107193 TX bytes:2085638

If you have successfully connected, you will see an IP address assigned under the wlan0 adapter. Now you can use adb connect <ip-address> to connect to the Pi. Once you’ve done that, you can interact with your Android Things device over ADB.

Stay tuned for our next post where we will talk about writing your first application with Android Things!


In Need of Parse Alternatives? Let’s Look at Firebase!

Until recently, Parse had been the go-to solution for mobile developers who needed to persist application data to the cloud without having to set up and run a custom API backend. All of that changed in January 2016, when Parse announced that they would be ending the service in 2017. Even before the announcement, Google’s Firebase had been gaining popularity as an alternative. Although Firebase does not offer a direct conversion mechanism from Parse, we decided to evaluate it for Bookdash, an open source project that we contribute to.

What is Firebase?

In a nutshell: Firebase is a cloud-based NoSQL data storage service with close to real-time updates. It offers features such as automatic updates, syncing, and caching for mobile clients. In addition, it has mechanisms for user authentication and authorization. All of the data can be viewed in the Firebase console as a JSON tree, and all authentication and authorization rules are written in JSON.
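As a sketch of what that looks like from a mobile client (this uses the pre-Google Firebase Android SDK that was current at the time; the database URL is a placeholder), a listener on a node receives the current value and every subsequent change:

import android.util.Log
import com.firebase.client.DataSnapshot
import com.firebase.client.Firebase
import com.firebase.client.FirebaseError
import com.firebase.client.ValueEventListener

fun listenForBooks() {
    val booksRef = Firebase("https://<your-app>.firebaseio.com/Books")
    booksRef.addValueEventListener(object : ValueEventListener {
        // Called once with the current data and again on every change.
        override fun onDataChange(snapshot: DataSnapshot) {
            Log.i("Firebase", "Books updated: " + snapshot.value)
        }

        override fun onCancelled(error: FirebaseError) {
            Log.e("Firebase", "Listener cancelled: " + error.message)
        }
    })
}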

What Are Its Strengths?

This list is not exhaustive, but an application with the following characteristics would be ideal for Firebase:

  • Simple data structures
  • Simple authorization and authentication rules
  • The need to quickly update data between clients that share data
  • The need to scale

Given that most mobile applications fall into one or more of these categories, Firebase makes it easy to get started and be productive.

What About Complex Data Structures?

Granted, things do get a bit more interesting when your application has complex data structures. Since Firebase is a NoSQL data store that stores JSON, there is nothing to stop you from keeping nested data in a denormalized format. Consider the following example:


{"Books":
    [{'author' : {
        'name': 'Matz'
        'publishers' : [{
              'company_name' : 'Pearson'
              'company_addresses' : [{
                   'address1' : '1 ruby way'
                    }]
              }],
         'author website' : 'www.ruby.com'
         }}
     'title' : 'Ruby Rocks'
    }]
}

In this example, we use a nested data structure to store data about a book. Let’s say that we have a very simple application that gives users a list of books along with their respective authors. Theoretically, you could simply store the data in a nested JSON structure, and at first glance that might seem like the obvious thing to do if you’re receiving all of the data from another source as a nested feed. There are, however, two problems with this.

First, what happens if some common biographical information about an author needs to change (for example, the author’s website)? With the current data structure, we’d need to update every book in the data store. This might be a reasonable trade-off if we have 10 books, but what if we have 1,000 or 1,000,000 books? For a mobile application that writes directly to Firebase, this would clearly be data intensive and highly impractical.

Second, if we attempt to retrieve a list of book titles and authors only, a query against the data store will also return a lot of unnecessary nested data (such as the publisher, publication date, etc.). Again, this can cause excessive data usage and slow down queries. One way to fix this is to normalize the data:


{
  "Books": [{
    "author": 1,
    "title": "Ruby Rocks"
  }],
  "authors": {
    "1": {
      "name": "Matz",
      "publishers": 1,
      "author website": "www.ruby.com"
    }
  },
  "publishers": {
    "1": {
      "company_name": "Pearson",
      "company_addresses": 1
    }
  },
  "company_addresses": {
    "1": {
      "address1": "1 ruby way"
    }
  }
}

This resolves one problem, but now we need to query against two top level JSON objects in order to get a book title and author. Two queries will not be a large performance burden, but what if our application needs a list of the book author, title, publisher name, and publisher address? We’re up to 4 queries. Now, what if we have a normalized data structure that spans 7 or 10 objects?

Denormalization

To fix the multiple-query problem, we may need to keep redundant data in multiple objects. For example, we might create one top-level object in Firebase that has the nested data structure and another that has the normalized structure. If Firebase is only hooked up to one web application that manages all of this, it adds a bit of complexity, but it remains manageable. One major advantage of using Firebase is that you don’t have to stand up your own server: it has native clients for Android and iOS that do a lot of the heavy lifting around getting, putting, and offline access to your Firebase data store.
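One common way to keep those redundant copies in sync is a multi-path (“fan-out”) write. Here is a sketch using the same era’s client API and paths based on the structures above; treat the exact paths and helper as illustrative:

import com.firebase.client.Firebase

// Sketch: update the author's website in both the normalized object
// and the denormalized copy with a single call.
fun updateAuthorWebsite(root: Firebase, newSite: String) {
    val updates = hashMapOf<String, Any>(
            "authors/1/author website" to newSite,
            "Books/0/author/author website" to newSite)
    root.updateChildren(updates)
}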

Many Clients

If you have only one application that can write data, this data structure may be a reasonable trade-off. But what if you have an iOS client and an Android client that both have the ability to edit the address for Pearson? More importantly, what if there are a few thousand book titles in your repository that were published by Pearson? This creates two problems. First, a mobile client might have to update a lot of records. Second, it increases the chance of out-of-sync data, since two codebases need to stay in sync with the updated data structure.

Using a Hybrid Approach

One possible solution is to set up your own server to manage writes between the mobile applications and the Firebase store. You’d have to deal with maintaining another application stack, scaling it for traffic, and manually handling data writes from the application. You could still have your app read directly from Firebase (which has a lot of advantages), but you would also add complexity to the system.

Should I Use Firebase?

Firebase is a good choice for many applications. If you have a simple data structure, it will handle a lot of the data sync and caching for you and give you a very robust back end that can deal with a lot of traffic. If you have a complex data structure or a lot of sophisticated authorization rules, things may not be as cut and dried. There are a lot of benefits to using it, and in our opinion, they often tip the scales in favor of Firebase. That said, there might be other compelling reasons, such as a desire to version your API, increase security, or integrate with other platforms, where it would be more advantageous to create a custom API on your own server instance.


Purr Programming Shirt Part 4 — Adding Android Wear Support

In the first three parts of this series, we discussed how to create a wearable garment (in our case, a t-shirt) with LEDs, how to connect it to an Android device using MetaWear, and how to make it change colors and flash when someone tweets at you. Now, let’s add another fun feature: triggering color changes and making the LEDs flash by using an Android Wear smartwatch. In this example, the app is going to listen for a speech command, check what is said, and prompt the colors to change and flash if the command is valid.

How Does It Work?


When you open the watch application, it launches a voice prompt. The user can state which pattern to use, such as “rubyfuza”. If the app recognizes the command, a Wearable.MessageApi message is sent. That message will then be read by an instance of WearableListenerService on the phone. The service creates a Broadcast Intent and sends it to the MetaWear Service, which, in turn, sends commands to the shirt to change colors or flash the LEDs in the specified pattern.

Setting Up the Wear App and Adding Voice Recognition

To get started, you need to add an Android Wear module to your application.

When we open the application, we’re immediately going to launch a prompt for the user’s voice command. To do that, we create an ACTION_RECOGNIZE_SPEECH intent and pass it into startActivityForResult.


    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        .
        .
        .
        val intent = Intent(RecognizerIntent.ACTION_RECOGNIZE_SPEECH)
        intent.putExtra(RecognizerIntent.EXTRA_LANGUAGE_MODEL,
                RecognizerIntent.LANGUAGE_MODEL_FREE_FORM)
        // Start the activity, the intent will be populated with the speech text
        startActivityForResult(intent, SPEECH_REQUEST_CODE)

    }

When voice recognition is complete, it calls onActivityResult with the results. We’re able to read the results by overriding this method, and if it is a SPEECH_REQUEST, we can call getStringArrayListExtra on the Intent object returned in the method. The first item in the array is a string representing the user’s speech.


   override fun onActivityResult(requestCode: Int, resultCode: Int, data: Intent) {
        if (requestCode == SPEECH_REQUEST_CODE && resultCode == Activity.RESULT_OK) {
            val results = data.getStringArrayListExtra(
                    RecognizerIntent.EXTRA_RESULTS)
            val spokenText = results[0]
            Log.i("Speech ", spokenText)
        }
    }

Sending the Command to the Phone

In order to send commands to a phone, we need to include the play-services dependency in the dependencies section of our wear build.gradle file.


compile 'com.google.android.gms:play-services-wearable:8.4.0'

In the main activity, we implement the GoogleApiClient.ConnectionCallbacks and GoogleApiClient.OnConnectionFailedListener interfaces.


class MainActivity : WearableActivity(), GoogleApiClient.ConnectionCallbacks, GoogleApiClient.OnConnectionFailedListener {
    .
    .
    .
    override fun onConnected(bundle: Bundle?) {
        Log.i("on connected", "here")
    }

    override fun onConnectionSuspended(i: Int) {

    }

    override fun onConnectionFailed(connectionResult: ConnectionResult) {
        Log.i("On conneced", "failed")
    }
}

Since Android Wear supports multiple Wear devices paired to the same phone, you need to identify which device you want to send a message to. This can be achieved with an AsyncTask, called at startup, that gets a list of all connected nodes. (To keep things simple in this example, we only support one paired Android Wear device.) We then store the node ID in a class variable to tell the MessageApi which device to send messages to.


   override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        .
        .
        .
        StartWearableActivityTask().execute()
    }

   private inner class StartWearableActivityTask : AsyncTask<Void, Void, Void>() {

        override fun doInBackground(vararg args: Void): Void? {
            getNode()
            return null
        }
    }

    private fun getNode() {
        val nodes = Wearable.NodeApi.getConnectedNodes(mGoogleApiClient).await()

        if (nodes.nodes.size > 0) {
            messageNodeId = nodes.nodes[0].id
        }
    }

Finally, we use Wearable.MessageApi.sendMessage to send commands to the phone.


Wearable.MessageApi.sendMessage(mGoogleApiClient, messageNodeId, command, null)
        .setResultCallback { sendMessageResult ->
            Log.i("Message Sent", sendMessageResult.status.isSuccess.toString())
        }

Receiving the Message

In the application on our phone, we now need to register a service that extends the WearableListenerService abstract class.

<service
   android:name=".WatchDataListenerService"
   android:enabled="true">
       <intent-filter>
             <action android:name="com.google.android.gms.wearable.BIND_LISTENER" />
       </intent-filter>
</service>

When messages are received, we parse the message path and send it as the command via a broadcast intent to the PurrProgrammingNotificationListenerService, which prompts the LEDs on the t-shirt to blink and change colors.


class WatchDataListenerService : WearableListenerService() {
    val SERVICE_NOTIFICATION = "com.polyglotprogramminginc.purrprogramming.SERVICE_NOTIFICATION"

    override fun onMessageReceived(messageEvent: MessageEvent?) {
        Log.i("message received", messageEvent!!.path)
        super.onMessageReceived(messageEvent)
        val i: Intent = Intent(SERVICE_NOTIFICATION)
        i.putExtra("command", messageEvent!!.path)
        sendBroadcast(i)
    }

}

User Feedback


Once a command is recognized, the user gets confirmation feedback on the watch. Instead of creating a mechanism in our Android Wear app to prompt the user for another command, we opted for a design that requires the user to close the app via a swipe and reopen it to issue another command. Please note, however, that this is not something we would recommend for a production app.

You can find the code for our app here.


Purr Programming Shirt Part 3 – Changing Colors Based On Twitter Notifications

In the previous installment of this blog series, we talked about setting up a service to talk to the MetaWear board. In the app, we flash and change the colors of the LEDs on the t-shirt by sending broadcast intents from the activity to the service. But, wouldn’t it be cool if the colors could change when we receive tweets, likes or mentions on social media? As it turns out, Twitter can send Android notifications that we, in turn, can listen to in the app. Let’s take a look at the details of setting this up.


Overriding NLServiceReceiver

Previously, we set up a service that overrides NotificationListenerService. This gives us a method called onNotificationPosted, which gets called, with a StatusBarNotification object, whenever a new Android notification is received on the device. A Twitter notification always has a packageName of com.twitter.android, and its .notification.extras includes a field called “android.text” that contains the name of the targeted Twitter account. Using this, we’re able to look at each notification and respond to Twitter notifications for the specific accounts we wish to monitor.

For example, if our Twitter client is logged in to both our @rubyfuza and @lgleasain accounts and someone tweets at @rubyfuza, the message will come in with a packageName of “com.twitter.android” and an “android.text” field under .notification.extras with the text “@rubyfuza”.


override fun onNotificationPosted(sbn: StatusBarNotification) {

  val i: Intent = Intent(ACTIVITY_NOTIFICATION);
  i.putExtra("notification_event", "onNotificationPosted :" + sbn.getPackageName() + "\n");

  if (sbn.packageName.equals("com.twitter.android")) {
    // turn into rubyfuza
    Log.i(TAG, "aaaand the text is " + sbn.notification.extras.get("android.text"))

    if (sbn.notification.extras.get("android.text").equals("@rubyfuza")){

      Log.i(TAG, "sending ruby intent")
      val i: Intent = Intent(SERVICE_NOTIFICATION)
      i.putExtra("command", "ruby")
      sendBroadcast(i)

    }
  }
}

Note that we cannot directly call MetaWear methods in this service because it will throw a threading exception. Instead, we send a broadcast intent with an extra field called “command”, which has the value “ruby”. The broadcast receiver that we talked about before will receive the intent and send a command to flash the LEDs.

In a typical application, we’d probably set this up in its own service, but a NotificationListenerService requires a special permission that the user enables via Settings -> Sound & notification -> Notification access. When you enable notification access for the app, it starts the service, and when you disable it, the service stops. Normally, you’d add a control to the main application to start and stop the service. However, since this is a proof-of-concept wearables project, we kept them together to ensure that a bug doesn’t drain extra battery when we’re not using the shirt. You can find a video of what we have so far over here.

Stay tuned for the next post when we’ll talk about adding AndroidWear support to the app.


Purr Programming Shirt Part 2 — Communicating with the MetaWear

In Part 1 of this series, we talked about the hardware for our Purr Programming t-shirt and how we connected all of the components. Now let’s talk about the Android app that controls the shirt’s LEDs.

Our app has the ability to:

  1. Give the user a series of buttons that allows him or her to flash a few different patterns on the NeoPixels, along with flashing the swirls.
  2. Flash the pixels and swirls in distinct patterns if someone tweets at Twitter accounts that are registered to receive notifications on the phone.
  3. Allow the user to use voice control on an Android Wear watch to manually launch the patterns that would normally be triggered by a tweet.


Because we want to change LED patterns based on the particular Twitter account targeted, we wanted to handle MetaWear commands in a service so that the app doesn’t need to be in the foreground in order to respond to new Twitter notifications. We also decided to use Kotlin instead of Java. If you’re interested in learning more about Kotlin, this is a great resource.

Setting up the Service

In the AndroidManifest.xml we need to set up a service along with the MetaWear BLE service.

<service android:name="com.mbientlab.metawear.MetaWearBleService" />

<service
    android:name=".PurrProgrammingNotificationListenerService"
    android:label="notifications"
    android:permission="android.permission.BIND_NOTIFICATION_LISTENER_SERVICE">
    <intent-filter>
        <action android:name="android.service.notification.NotificationListenerService" />
    </intent-filter>
</service>

In this instance, we’re calling the service PurrProgrammingNotificationListenerService; it extends NotificationListenerService and implements ServiceConnection.


class PurrProgrammingNotificationListenerService : NotificationListenerService(), ServiceConnection {

}

Our service is going to register a broadcast receiver that will receive messages from our app activity and smart watch.


internal inner class NLServiceReceiver : BroadcastReceiver() {
    override fun onReceive(context: Context, intent: Intent) {
        if (intent.getStringExtra("command").equals("clearall")) {
            // handle the "clearall" command here (e.g., turn off all of the LEDs)
        }
    }
}

Instead of binding the MetaWear BLE service in an Activity, we do it in the service’s onCreate.


override fun onCreate() {
    super.onCreate();
    nlservicereciver = NLServiceReceiver();
    val filter: IntentFilter = IntentFilter();
    filter.addAction(SERVICE_NOTIFICATION);
    registerReceiver(nlservicereciver, filter);
    bindService(Intent(this, MetaWearBleService::class.java),
            this, Context.BIND_AUTO_CREATE)
}

Because this is a one-off project, we hard-coded the MAC address, and we immediately try to connect to the board once the service is bound. Our connection handler includes logic to automatically reconnect the board if it gets disconnected. If this were a production application, we would add logic to connect, and stay connected, only when actively sending commands to the board; always staying connected is not efficient, and anecdotally, the battery life on a Nexus 5 took about a 20% hit during our tests.


private val connectionStateHandler = object : MetaWearBoard.ConnectionStateHandler() {
    override fun failure(status: Int, error: Throwable?) {
        super.failure(status, error)
    }
    override fun disconnected() {
        mwBoard!!.connect()
        Log.i("Purr Programming Service", "disconnected")
    }
    override fun connected() {
        Log.i("Purr Programming Service", "connected")
        if (!connected) {
            Log.i("Main Activity", "Initializing neoPixels")
            connected = true
        }
        val result = mwBoard!!.readRssi()
        result.onComplete(object : AsyncOperation.CompletionHandler() {
            override fun success(rssi: Int) {
                Log.i("RSSI is ", Integer.toString(rssi))
            }
        })
    }
}

Flashing the Swirls

The swirling LEDs are controlled individually via FETs that are connected to GPIO pins 1 and 2. They are turned on by setting the pin and turned off by clearing it.


private fun turnOnSwirl(pin: Int) {
    gpIO.setDigitalOut(pin.toByte())
}
private fun turnOffSwirl(pin: Int) {
    gpIO.clearDigitalOut(pin.toByte())
}

Setting Up the NeoPixels

First, we need to initialize the strand. In the app, the user needs to manually trigger this.


npModule = mwBoard!!.getModule(NeoPixel::class.java)
npModule.initializeStrand(STRAND, NeoPixel.ColorOrdering.MW_WS2811_GRB,
        NeoPixel.StrandSpeed.SLOW, GPIO_PIN, 8)

Once the strand is initialized, pixels are set with the setPixel command.


npModule.setPixel(STRAND, (pixels[index] + 1).toByte(),
        colors[index]!!.get(RED)!!,
        colors[index]!!.get(GREEN)!!,
        colors[index]!!.get(BLUE)!!)

We then loop through the pixels that we want to set in a helper method.


private fun setRubyScreen(colorIndex: Int) {
    try {
        val color: Map<String, Byte> = computerScreenRuby[colorIndex]!!
        for (pixel in 2..5) {
            npModule.setPixel(STRAND, pixel.toByte(),
                    color[RED]!!,
                    color[GREEN]!!,
                    color[BLUE]!!)
        }
    } catch(e: Exception) {
        Log.i("problem with ", e.toString())
    }
}

Next, we call these from a timer to get the flashing effects.


private fun startRubyScreen() {
    currentWorkFlow = STATE.RUBY_SCREEN
    object : CountDownTimer(30000, 2000) {
        override public fun onTick(millisUntilFinished: Long) {
            setRubyScreen((rubyIndex++).mod(3))
            setCatEye(1, rubyIndex.mod(5))
            setPaw(rubyIndex.mod(2))
        }
        override public fun onFinish() {
            flashSwirls()
        }
    }.start();
}

The main app is then wired in by sending a broadcast intent. The intent is picked up by the service, which, in turn, executes the requested light pattern on the shirt.


val i: Intent = Intent(SERVICE_NOTIFICATION)
i.putExtra("command", "eyes")
sendBroadcast(i)

The source code for our working application can be found over here. Stay tuned for the next post where we’ll talk about how to listen to notifications in the service.
