Bitemporal History

When we think of how some property (e.g. your address or
salary) changes over time, we usually think of it as a linear sequence of
changes. But surprisingly often, it can get rather more tangled than that,
in a way that can often confuse computerized records.

I can illustrate all this with a simple example:

  • We are handling payroll for our company. We run the payroll for our
    company on February 25, and our employee Sally is paid according to her
    monthly salary of $6000.
  • On March 15 we get an apologetic letter from HR telling us that, on
    February 15th, Sally got a pay raise to $6500.

So what should we answer when we’re asked what Sally’s salary was on
February 25? In one sense we should answer $6500, since we know now that that
was the rate. But often we cannot ignore that on Feb 25 we thought the salary
was $6000; after all, that's when we ran payroll. We printed a check, sent it to
her, and she cashed it. These actions were all based on what we believed her
salary to be. If the tax authorities asked us for her salary on Feb 25, this becomes
important.

The Two Dimensions

I find I can make sense of much of this tangle by thinking of time as
two dimensions – hence the term “bitemporal”. One dimension is the
actual history of Sally’s salary, which I’ll illustrate by sampling on the
25th of each month, since that’s when payroll runs.

date salary
Jan 25 6000
Feb 25 6500
Mar 25 6500

The second dimension comes in as we ask what did we think Sally’s salary
history was on February 25? On February 25th we hadn’t got the letter from
HR, so we thought her salary was always $6000. There is a difference between
the actual history, and our record of the history. We can show this by
adding new dates to our table

record date actual date salary
Jan 25 Jan 25 6000
Feb 25 Jan 25 6000
Mar 25 Jan 25 6000
Feb 25 Feb 25 6000
Mar 25 Feb 25 6500
Mar 25 Mar 25 6500

I’m using the terms actual and record history for the
two dimensions. You may also hear people using the terms valid, or
effective (for actual) and transaction (for record).

I read the rows of this table by saying something like “on Mar 25th, we
thought Sally’s salary on Feb 25th was $6500”.
Using this way of thinking, I can look at the earlier table of Sally’s actual history,
and say that more precisely it’s Sally’s actual history as known (recorded)
on March 25.

In programming terms, if I want to know Sally's
salary, and I have no history, then I can get it with something like
sally.salary. To add support for (actual) history I need to use
sally.salaryAt('2021-02-25'). In a bitemporal world I need
another parameter: sally.salaryAt('2021-02-25', '2021-03-25').
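
As a rough sketch, those three access forms might look something like the following (illustrative Dart, written to match the code samples elsewhere in this collection; the names are mine, not from the original):

// Illustrative sketch of the three levels of history support.
abstract class SalaryRecord {
  /// sally.salary: no history at all, just the current value.
  double get salary;

  /// sally.salaryAt('2021-02-25'): actual history only.
  double salaryAt(DateTime actualDate);

  /// sally.salaryAt('2021-02-25', '2021-03-25'): bitemporal, i.e. the salary
  /// on [actualDate] as we had it recorded on [recordDate].
  double salaryAsRecorded(DateTime actualDate, DateTime recordDate);
}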

Another way to visualize this is to make a plot where the x axis is
actual time and the y axis is record time. I shade the region according to the
salary level. (The shape of the plot is triangular since we're not
trying to record future values.)

With this plot, I can make a table for how actual history changes with
each run of payroll on the 25th. We see that the Feb 25 payroll ran at a
time when Sally had no raise, but when the Mar 25 payroll ran, the raise was
known.

Changing the Retroactive Change

Now consider another communication from HR

  • April 5: Sorry, there was a typo in our previous email. Sally’s raise
    on Feb 15 was to $6400. Sorry for the inconvenience.

This is the kind of change that makes angels weep. But when we think of
it in terms of bitemporal history, it’s not that difficult to understand.
Here’s the plot with this new bit of information.

The horizontal lines, used for the payrolls, represent the actual history
at a certain point in record time. On April 25 we know Sally’s salary
increased from $6000 to $6400 on February 15. From that perspective, we never
see Sally’s $6500 salary because it was never true.

Looking at the diagram, what does a vertical line mean?

This represents our knowledge of the value at a certain actual date: the
recorded salary for February 25th, as our knowledge changed over time.

Using Bitemporality

Bitemporal history is a useful way of framing history when we have to deal with
retroactive changes. However we don’t see it used that often, partly
because many people don’t know about the technique, but also because we
can often get away without it.

One way to avoid it is to not support retroactive changes. If your
insurance company says any changes come into force when they receive your
letter, then that’s a way of forcing actual time to match record
time.

Retroactive changes are a problem when actions are based on a
past state that’s retroactively changed, such as a salary check being sent
out based on a now-updated salary level. If we are merely recording a
history, then we don’t have to worry about it changing retroactively – we
essentially ignore record history and only record actual history. We may
do that even when actions are involved, if the action is recorded in
such a way that it captures any necessary input data. So the payroll for
Sally could record her salary at the time it issues the check, and that’s
enough for audit purposes. In that situation we can get away with only the
actual history of her salary. The record history is then buried inside her
payroll statements.

We may also get away with only actual history if any retroactive
changes are made before an action occurs. If we had learned of Sally’s
salary change on February 24th, we could have adjusted her record before the
payroll action relied on the incorrect figure.

If we can avoid using bitemporal history, then that’s usually
preferable as it does complicate a system quite significantly. However
when we have to deal with discrepancies between actual and record history,
usually due to retroactive updates, then we need to bite the bullet. One
of the hardest parts of this is educating users on how bitemporal history
works. Most people don’t think of a historical record as something that
changes, let alone of the two dimensions of record and actual history.

Append-only History

In a simple world a history is append-only. If communication is perfect
and instantaneous, then all new information is learned immediately by every
interested actor. We can then just treat history as something we add to as
new events occur in the world.

Bitemporal history is a way of coming to terms with the fact that communication
is neither perfect nor instantaneous. Actual history is no longer
append-only: we go back and make retroactive changes. However, record
history itself is append-only. We don’t change what we thought we
knew about Sally’s salary on Feb 25. We just append the later knowledge we
gained. By layering an append-only record history over the actual history, we
allow the actual history to be modified while creating a reliable history
of its modifications.

Consequences of Retroactive Changes

Bitemporal history is a mechanism that allows us to track how a value
changes, and it can be extremely helpful to be able to ask
sally.salaryAt(actualDate, recordDate). But retroactive
changes do more than just adjust the historical record. As the expert
says: “People assume that time is a strict progression of cause to effect,
but actually from a non-linear, non-subjective viewpoint – it’s
more like a big ball of wibbly wobbly timey wimey stuff.” If we’ve paid Sally $6000 when we should have paid her
$6400, then we need to make it right. At the very least that means getting
more in a later paycheck, but it may also lead to other consequences.
Maybe the higher payment means she should have crossed some important
threshold a month earlier, maybe there are tax implications.

Bitemporal history alone isn’t enough to figure out what these dependent
effects are; that demands a set of additional mechanisms, which are beyond
the scope of this pattern. One measure is to create a parallel model,
which captures the state of the world as it should have been with the
correct salary, and use this to figure out the compensating changes.
Bitemporal history can be a useful element
for these kinds of measures, but it only unravels part of that big ball.

Perspectives for Record Time

My example above for record time uses dates to capture our changing
understanding of actual history. But the way we capture record history can
be more involved than that.

To make everything easier to follow above, I sampled the history on the
payroll dates. But a better representation of a history is to use date
ranges. Here’s a table covering 2021.

record dates actual dates salary
Jan 1 – Mar 14 Jan 1 – Dec 31 6000
Mar 15 – Apr 4 Jan 1 – Feb 14 6000
Mar 15 – Apr 4 Feb 15 – Dec 31 6500
Apr 5 – Dec 31 Jan 1 – Feb 14 6000
Apr 5 – Dec 31 Feb 15 – Dec 31 6400

We can think of Sally’s salary being recorded with a combination of two
keys, the actual key (a date range) and the record key (also a date
range). But our notion of record key can be more complicated than that.

One obvious case is that different agents can have different record
histories. This is clearly the case for Sally: it took time to get
messages from the HR department to the Payroll department, so the record
times for those modifications to actual history will differ between the
two.

department record dates actual dates salary
HR Jan 1 – Feb 14 Jan 1 – Dec 31 6000
HR Feb 15 – Dec 31 Jan 1 – Feb 14 6000
HR Feb 15 – Dec 31 Feb 15 – Dec 31 6400
Payroll Jan 1 – Mar 14 Jan 1 – Dec 31 6000
Payroll Mar 15 – Apr 4 Jan 1 – Feb 14 6000
Payroll Mar 15 – Apr 4 Feb 15 – Dec 31 6500
Payroll Apr 5 – Dec 31 Jan 1 – Feb 14 6000
Payroll Apr 5 – Dec 31 Feb 15 – Dec 31 6400

Anything that can record a history will have its own record
timestamps for when it learns information. Depending on the data, an
enterprise may choose a certain agent to be the defining
agent for recording certain kinds of data. But agents will cross lines of
authority – however big the company, it won’t change the recording dates of
the tax authorities it deals with. A lot of effort goes into sorting out
problems caused by different agents learning the same facts at different
times.

We can generalize what’s happening here by combining the notion of the
department and record date range into a single concept of a perspective.
Thus we’d say something like “according to HR’s perspective on Feb 25,
Sally’s salary was $6400”. In a table form, we might visualize it like
this.

perspective actual dates salary
HR, Jan 1 – Feb 14 Jan 1 – Dec 31 6000
HR, Feb 15 – Dec 31 Jan 1 – Feb 14 6000
HR, Feb 15 – Dec 31 Feb 15 – Dec 31 6400
Payroll, Jan 1 – Mar 14 Jan 1 – Dec 31 6000
Payroll, Mar 15 – Apr 4 Jan 1 – Feb 14 6000
Payroll, Mar 15 – Apr 4 Feb 15 – Dec 31 6500
Payroll, Apr 5 – Dec 31 Jan 1 – Feb 14 6000
Payroll, Apr 5 – Dec 31 Feb 15 – Dec 31 6400

What does collapsing these into a single perspective concept give us? It
allows us to think about what other perspectives might be. One example is
to consider alternative perspectives. We could create a perspective where
we remove individual raises (such as Sally’s on Feb 15) and give
every employee a salary raise of 10% on March 1st. That would lead to a new
record-time dimension for Sally’s salary.

perspective actual dates salary
real world Jan 1 – Feb 14 6000
real world Feb 15 – Dec 31 6400
with global raise Jan 1 – Feb 28 6000
with global raise Mar 1 – Dec 31 6600

This generalization of the notion of record time says that we can layer
multiple perspectives over an actual history, using essentially the same
mechanism to reason about retroactive changes and alternative histories.

Putting many perspective dimensions over a history isn’t something
that’s widely useful, even compared to bitemporal history. But I find it a
helpful way to think about these kinds of situations: reasoning about
alternative scenarios, either historically, or in the future.

Storing and Processing Bitemporal Histories

Adding history to data increases complexity. In a bitemporal world I
need two date parameters to access Sally’s salary –
sally.salaryAt('2021-02-25', '2021-03-25'). We can simplify
access with defaults: if we treat the default for record time as today, then
processing that only needs the current record time can ignore the bitemporal
complications.
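
One plausible way to express that default, sketched in Dart (the names are illustrative; the underlying two-argument lookup is assumed to exist, as in the earlier sketch):

// Sketch: record time defaults to "now", so callers that only care about
// the current record time never have to see the second dimension.
class Salary {
  // Underlying bitemporal lookup: (actual date, record date) -> amount.
  final double Function(DateTime actual, DateTime record) lookup;
  Salary(this.lookup);

  double at(DateTime actualDate, {DateTime? recordDate}) =>
      lookup(actualDate, recordDate ?? DateTime.now());
}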

Simplifying access, however, doesn’t necessarily simplify storage. If
any client needs bitemporal data, we have to store it somehow. While there
are some databases that have built-in support for some level of
temporality, they are relatively niche. And wisely, folks tend to be
extra-wary of niche technologies when it comes to long lived data.

Given that, often the best way is to come up with our own scheme. There
are two broad approaches.

The first is to use a bitemporal data structure: encoding the necessary date
information into the data structure used to store the data. This could
work by using nested date range objects, or a pair of start/end dates in a
relational table.

record start record end actual start actual end salary
Jan 1 Mar 14 Jan 1 Dec 31 6000
Mar 15 Apr 4 Jan 1 Feb 14 6000
Mar 15 Apr 4 Feb 15 Dec 31 6500
Apr 5 Dec 31 Jan 1 Feb 14 6000
Apr 5 Dec 31 Feb 15 Dec 31 6400

This allows access to all the bitemporal history, but is awkward to
update and query – although that can be made easier by having a library
handle access to bitemporal information.
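
As a sketch of what such a library might wrap, here is the start/end-date structure from the table above expressed in Dart, with a lookup over it (names and details are mine, not a prescribed implementation):

// Sketch of the "pair of start/end dates" rows shown in the table above.
class BitemporalRow {
  final DateTime recordStart, recordEnd; // when we believed this row
  final DateTime actualStart, actualEnd; // when the value was in effect
  final double salary;
  const BitemporalRow(this.recordStart, this.recordEnd,
      this.actualStart, this.actualEnd, this.salary);
}

/// Returns the salary effective on [actualDate] as recorded on [recordDate],
/// or null if no row covers that combination.
double? salaryFor(
    List<BitemporalRow> rows, DateTime actualDate, DateTime recordDate) {
  bool within(DateTime d, DateTime start, DateTime end) =>
      !d.isBefore(start) && !d.isAfter(end);
  for (final row in rows) {
    if (within(recordDate, row.recordStart, row.recordEnd) &&
        within(actualDate, row.actualStart, row.actualEnd)) {
      return row.salary;
    }
  }
  return null;
}

// e.g. with the five rows from the table above:
// salaryFor(rows, DateTime(2021, 2, 25), DateTime(2021, 3, 25)) == 6500
// salaryFor(rows, DateTime(2021, 2, 25), DateTime(2021, 4, 25)) == 6400

An update would have to close off the record range of the superseded rows and append new ones, which is exactly the awkward part that benefits from being hidden behind such a library.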

The alternative is to use event
sourcing. Here we don’t store the state of Sally’s salary as our
primary store; instead we store all the changes as events. Such events
might look like this:

record date actual date action value
Jan 1 Jan 1 sally.salary 6000
Mar 15 Feb 15 sally.salary 6500
Apr 5 Feb 15 sally.salary 6400

Note that if events need to support bitemporal
history, they need to be bitemporal themselves. This means each event
needs an actual date (or time) for when the event occurred in the world,
and a record date (or time) for when we learned about it.
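
As a sketch of how such bitemporal events could be replayed to answer a query (illustrative Dart; the event and function names are mine, not from the original):

// Each event carries both dates, as described above.
class SalaryEvent {
  final DateTime recordDate; // when we learned about the change
  final DateTime actualDate; // when the change took effect in the world
  final double value;
  const SalaryEvent(this.recordDate, this.actualDate, this.value);
}

/// Replays only the events known by [recordDate] and in effect by
/// [actualDate]; the latest applicable one wins.
double? replaySalary(
    List<SalaryEvent> events, DateTime actualDate, DateTime recordDate) {
  final applicable = events
      .where((e) =>
          !e.recordDate.isAfter(recordDate) &&
          !e.actualDate.isAfter(actualDate))
      .toList()
    ..sort((a, b) {
      final byActual = a.actualDate.compareTo(b.actualDate);
      return byActual != 0 ? byActual : a.recordDate.compareTo(b.recordDate);
    });
  return applicable.isEmpty ? null : applicable.last.value;
}

// With the three events from the table above:
// replaySalary(events, DateTime(2021, 2, 25), DateTime(2021, 3, 25)) == 6500
// replaySalary(events, DateTime(2021, 2, 25), DateTime(2021, 4, 25)) == 6400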

Storing the events is conceptually more straightforward, but requires
more processing to answer a query. However, much of that processing can be cached
by building a snapshot of the application’s state. So if most users of
this data only require current actual history, then we could build a data
structure that only supports actual history, populate it from
the events, and keep it up to date as new events trickle in. Those users
who wanted bitemporal data could create a more complex structure and
populate it from the same events, but their complexity wouldn’t make
things harder for those who wanted the simpler model. (And if some people
wanted to look at actual history on a different record date, they could
use almost all the same code for working with current actual history.)
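
A sketch of that simpler, actual-history-only snapshot, derived from the same events (assuming the SalaryEvent type from the previous sketch):

// Builds an actual-history-only view for consumers that never ask about
// record time: actual date -> salary effective from that date.
Map<DateTime, double> actualHistorySnapshot(List<SalaryEvent> events) {
  final ordered = [...events]
    ..sort((a, b) => a.recordDate.compareTo(b.recordDate));
  final snapshot = <DateTime, double>{};
  for (final e in ordered) {
    snapshot[e.actualDate] = e.value; // later knowledge overwrites earlier
  }
  return snapshot;
}

New events simply trickle in as further overwrites, so the snapshot stays current without its consumers ever seeing record time.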


The surprising truth about re-platforming databases to public cloud


Enterprises are moving full steam to the public cloud unencumbered by what is happening in the economy. If anything, volatility due to Covid has raised the importance of cloud benefits and the prospect of flexibility and scalability has further accelerated this movement. Enterprises no longer view public cloud as merely Infrastructure as a Service (IaaS). Instead, they are looking for a highly integrated enterprise platform.

Data is the heart of the matter. In particular, data warehousing has emerged as the backbone of cloud data strategies. Every CIO must now solve the challenge of re-platforming their workloads from on-prem systems to cloud native ones.

Curiously, instead of new vendors sweeping the floor with their legacy counterparts, a very different dynamic is emerging. For example, Snowflake, which positioned itself as the #1 cloud migrator, recently had to admit that it isn’t that easy after all.

On the other hand, an incumbent boldly proclaimed that vendor lock-in will keep them in business for a very long time. And their surging stock price suggests that analysts may see it the same way.

What’s going on here? Why is there not more turnover in the database market if it is of such high value? The answer is as simple as it is disheartening: moving between databases is a cruelly difficult business. Fueled by overly optimistic advertising, many simply underestimate the challenge.

Database migrations have an abysmal track record

The industry has long grappled with the problem of migrating between databases. Not surprisingly, enterprises suffer when vendor lock-in holds them back from tapping new technology and innovation. Talk to any enterprise IT leader, and you learn that a considerable amount of time and money is spent continually on trying to keep up with new technology developments and moving from old to new.

More concretely, any proposal to move workloads on a mid-range data warehouse system of one vendor to another is an eye-watering experience. The typical estimates are upward of 3 years of time with a price tag of at least $20m. And that’s just the opening gambit. Once the migration is underway, things very often spiral out of control: $20m becomes $50m, and 3 years becomes 5 years. Finally, a new CIO just puts an end to it altogether to stop the bleeding.

For each successful migration, there are about 6-10 failed ones. Even the successful ones are not always convincing. Often, a successful migration is little more than a partial offloading where the legacy system continues to run complex workloads that were just too difficult to move. The result is an ever-increasing fragmentation of the IT landscape within the enterprise and with it increasing technical debt.

Application rewrite is the true problem

Fueled by grand statements made by vendors of database migration tools, customers often fall into the trap of thinking that transferring the content from the old to the new system is the problem. It is a critical part, no question. But it’s just a small fraction of the cost.

The lion’s share of the pain—and the cost—comes from rewriting applications. Applications need to be adjusted to make them work with the new database. Vendor-specific SQL, tools and utilities have found their way into every crevice of the enterprise. Even inside 3rd party systems, custom SQL was a critical element of accelerating the business in the past. However, what once was a competitive advantage has turned into a liability.

None of this should come as a surprise. So why then the high failure rate? Shouldn’t we, as an industry, know better by now? First, until recently there simply wasn’t an alternative, so we just soldiered on. Second, the problem is extremely treacherous: the first 80% of the migration is often a walk in the park and tricks folks into believing they are about done. It’s the last 20% that kills migrations.

The difficulty of the 20% comes from the tight interplay between application and database content. Any compromise during content transfer (lack of data types, lack of support of Stored Procedures, etc.) exponentially increases the difficulty of the application rewrite. Staging a migration to convert content first, and applications independently afterward is a recipe for failure.

Bespoke solutions are non-solutions

Today, the industry implements bespoke solutions. This is fancy-talk for cutting corners. Yet, we’ve been doing this so long that it has become folklore. Ask any IT leader and they will associate database migrations with failed projects that overran their budget and were way behind schedule by the time they got killed off.

However, as database technology becomes more commoditized, there is less and less room for these bespoke solutions. The enterprise that doesn’t have to resort to a bespoke solution can run faster and outperform its competitors. The pace of tech adoption then truly becomes a competitive advantage.

Time for a new paradigm

Much effort is being devoted to speeding up migrations: automatic code conversion is an area of intense development. However, instead of speeding up something known to be ultimately insufficient, a new paradigm is in order. As Henry Ford said: “If I had asked people what they wanted, they would have said ‘faster horses’”.

Similarly, the database industry needs to break out of the cycle of old approaches that haven’t delivered. The problem isn’t new: other areas of IT had exactly the same challenge. Practically every one of them has been redefined in the past 20 years by virtualization. From server virtualization to storage and network, the concept of virtualization has eviscerated migration challenges across the board.

With the industry-wide need to migrate database systems to the cloud, virtualization—the disintermediating of applications and database systems—is the logical next step. While still a young discipline, the first products are already on the market with more under active development. The future where applications can move seamlessly between databases has just begun.

Flutter load data into listview using GetX library


GetX Listview :

In the previous tutorial we saw how to fetch data using the GetX library. In this blog we will go through the process of populating data into a ListView in a Flutter app using GetX.

We have used a provider, which will go through the widget tree until it finds the value, along with a controller and a data binding, in order to fetch the data and populate it into the ListView.

We can populate data dynamically using this ListView widget. Follow this GetX tutorial till the end for more detailed information.

 

 

 

pubspec.yaml :

Add the get dependency in pubspec.yaml to implement the GetX ListView.

dependencies:
  flutter:
    sdk: flutter
  get: ^4.3.4

 

provider.dart :

Specify the provider class, which extends GetConnect, and implement the getUser() async method. We specify the API call inside this method.

Based on the response we populate the ListView; if there is any error, we display an error message.

 

import 'package:get/get.dart';

class Provider extends GetConnect {
  Future<List<dynamic>> getUser() async {
    final response = await get('https://randomuser.me/api/?results=10');
    if (response.status.hasError) {
      return Future.error(response.statusText!);
    } else {
      return response.body['results'];
    }
  }
}


 

datacontroller.dart :

Specify the data controller class, which extends GetxController. Inside it we specify onInit() and onClose().

Using these we can fetch the data and populate it.

 

import 'package:flutter_fetch_data_getx/provider/provider.dart';
import 'package:get/get.dart';

class DataController extends GetxController with StateMixin<List<dynamic>> {

  @override
  void onInit() {
    super.onInit();
    Provider().getUser().then((value) {
      change(value, status: RxStatus.success());
    }, onError: (error) {
      change(null, status: RxStatus.error(error.toString()));
    });
  }

  @override
  void onClose() {
    super.onClose();
  }
}

 

databinding.dart :

Initialize the controller inside the data binding. This way we can access the data throughout the app.

 

import 'package:flutter_fetch_data_getx/controller/datacontroller.dart';
import 'package:get/get.dart';

class DataBinding extends Bindings {
  @override
  void dependencies() {
    Get.lazyPut(() => DataController());
  }
}


 

route.dart :

Specify the routes, i.e. the screens to be navigated to, in this class. Here we can also specify the bindings associated with each screen.

Add the pages as required in this routes class.

 

import 'package:flutter_fetch_data_getx/bindings/databinding.dart';
import 'package:flutter_fetch_data_getx/data.dart';
import 'package:get/get.dart';

class Routes {

  static final routes = [

    GetPage(
      name: '/data',
      page: () => Data(),
      binding: DataBinding(),
    ),

  ];
}

 

main.dart :

Declare the GetMaterialApp from the GetX library and specify the initial route and the GetX pages.

 

import 'package:flutter/material.dart';
import 'package:flutter_fetch_data_getx/routes/route.dart';
import 'package:get/get.dart';

void main() {
  runApp(MyApp());
}

class MyApp extends StatelessWidget {
  const MyApp({Key? key}) : super(key: key);

  @override
  Widget build(BuildContext context) {
    return GetMaterialApp(
      initialRoute: '/data',
      getPages: Routes.routes,
    );
  }
}

 

data.dart :

Populate the data onto the screen using this class to implement the GetX ListView. Here we extend GetView and specify the data controller.

 

import 'package:flutter/material.dart';
import 'package:flutter_fetch_data_getx/controller/datacontroller.dart';
import 'package:get/get.dart';

class Data extends GetView<DataController> {
  @override
  Widget build(BuildContext context) {
    return Scaffold(
      appBar: AppBar(
        title: Text('GetX Network Call'),
      ),
      body: controller.obx(
        (data) => Center(
          child: ListView.builder(
            itemCount: data!.length,
            itemBuilder: (BuildContext context, int index) {
              return Card(
                child: Column(
                  children: [
                    ListTile(
                      title: Text(data[index]['name']['first']),
                      subtitle: Text(data[index]['name']['last']),
                      leading: CircleAvatar(
                        backgroundImage: NetworkImage(
                          data[index]['picture']['thumbnail'],
                        ),
                      ),
                    ),
                  ],
                ),
              );
            },
          ),
        ),
      ),
    );
  }
}



Output :

This screen depicts the GetX ListView usage.

GetX Listview

 

If you have any queries on this GetX ListView tutorial, do let us know in the comment section below. If you like this tutorial, do like and share it for more interesting content.

 

Upcloud VPS Review – Features, Pricing and Alternatives


Upcloud is a Finland-based cloud service provider that offers hosting solutions as cheap as $5 per month. Upcloud hosting is one of the best alternatives to DigitalOcean, Linode, AWS, Azure, Skysilk, Interserver & Vultr. In this fast-paced world, it is very important that your website does not take much time to load, even on devices with low internet speed. As such, apart from choosing a fast hosting solution, it is crucial for webmasters to deploy their websites on the servers nearest to the locations their visitors mostly come from.

Upcloud VPS Review: Is it really worth it?

I remember the old days when I used to open five to six web pages from the search results and continued with the one that loaded faster (exceptions exist). That was the time when faster was the winner. Much of that has not changed yet. Even now, with fast internet speeds, we have witnessed the growth of video content. As such, many websites use embedded video and social media content, loads of JavaScript libraries and what not, which makes them very heavy. So, as of today, not only a better server but also optimization can help you get the speed. In this post, I will review the Upcloud hosting service and also tell you how you can optimize your website and server for better performance.

Upcloud Hosting Features: Review Highlights

Secure and Reliable

Upcloud offers a hosting service that is secure and capable of limiting traffic by IP address or port with a firewall. It offers scheduled, simple and flexible backup options that make it highly reliable.

100% Uptime

Uptime is very important for any project you may have. You can use any uptime monitoring tool, such as Screpy, to monitor uptime. Upcloud gives its users a 100% uptime SLA and a 50x payback for any downtime over 5 minutes.

24/7 Customer Support

Customer support is very crucial for determining the strength of a hosting company. Upcloud is among those that offer a quick response to your queries. Their customer support is very reliable. There is a live chat option enabled on their website as well. So, when you put in your query, a support person is immediately assigned to the conversation to assist you with your questions. Upcloud uses Intercom live chat to assist its customers.

Highly Scalable

Most web owners switch from shared hosting or low-potential hosting to cloud VPS hosting when they need a scalable plan. For all those enthusiasts, Upcloud is really a boon. Their cloud infrastructure assures you of the best performance and uptime. Considering its scalability, Upcloud is one of the best hosting alternatives you can think of.

Flexible Hosting

Upcloud is definitely a flexible hosting solution. You can at any time opt to resize your server, which includes downgrading and upgrading your existing server. In order to resize your existing server, you first need to shut it down (which is not the same as deleting it), and this takes only 2-3 minutes.

You can either choose from the standard resizing plans or opt to customize your plan under the flexible plan option, where you can manually choose the CPU, memory (RAM) or storage.

Cost Effective Hosting Solution

Upcloud charges its clients for the servers on an hourly basis. As such, clients have complete flexibility to resize the server or test a new server without having to worry about high costs.

2-months free migration period

With Upcloud, you don’t have to worry about the extra costs of migration from a different cloud service provider. Upcloud charges you no service costs for the migration during a 2-month free migration period. However, this offer is only available for entities whose infrastructure costs at least $500 per month on Upcloud.

Designed for the developers

With Upcloud it is easy to use the control panel and API that let users code more reliably.

Upcloud Designed for Developers

Performance with MaxIOPS

MaxIOPS block storage technology leaves the competition behind and doesn’t throttle performance depending on how much you pay. It is far better than ordinary SSD storage. It gives you high performance and reliability. With the cloud technology, you can create your server within a minute. With MaxIOPS, you can also move a MaxIOPS block storage device to another cloud server. It is also very flexible, as you can customize the amount of storage space you need.

Upcloud Object Storage

The hosting platform also offers fast-to-deploy and easy-to-manage Object Storage. One can use it to deliver content worldwide. With Upcloud Object Storage you get up to 1 TB of storage and 2 TB of transfer for a monthly cost of $20.

Upcloud Object Storage

Upcloud UI/Interface

The Upcloud website has a modern interface that is easy to understand and fast to navigate. Once you open up your dashboard, you get to know everything very clearly. Wherever an unfamiliar term is used, there is a knowledge base link from where you can learn a lot more about it. Any new user can easily deploy a new server, resize it, create automated backups, add team members, and more.

Upcloud Backups

With Upcloud, you can create backups by paying additional costs for your server. These automated backups can be purchased both at the time of new server deployment and at any time after the server is created. There are three backup plans one can opt for viz. Week Plan, Month Plan, and Year plan.

  • Week Plan: daily backups for 7 days; no weekly backups; no monthly backups.
    Price on top of plan: 20% ($1 per month for the $5 per month server plan;
    price varies based on the server selected).
  • Month Plan: daily backups for 7 days; weekly backups for 4 weeks; no monthly
    backups. Price on top of plan: 40% ($2 per month for the $5 per month server
    plan; price varies based on the server selected).
  • Year Plan: daily backups for 7 days; weekly backups for 4 weeks; monthly
    backups for 1 year. Price on top of plan: 60% ($3 per month for the $5 per
    month server plan; price varies based on the server selected).

Upcloud Advanced Firewall

Upcloud offers an advanced firewall that limits server traffic. It ensures advanced protection from various hacking attempts. To make use of the firewall, you need to add incoming and outgoing traffic rules. Like automated backups, the firewall adds an extra cost on top of your normal server cost. The firewall works based on rules matched on a first-match basis from top to bottom.

Upcloud Firewall

Server Tags

Upcloud allows you to organize your servers. As such, you can give a tag to any of your existing servers. With the help of these tags, you can segregate your various projects and even make use of them while giving permissions to members inside your workspace.

Add Team Members

Add Members in Upcloud Workspace

Like Linode, DigitalOcean, and many other hosting platforms, Upcloud too allows you to add trusted members to whom you can give various permissions. You have control over whether to give them access to all or any selected property for servers, private networks, storage, and tags.

Server Statistics

Upcloud Statistic Report for Server

You can see the stats for your server inside your server overview dashboard. It shows you statistics graphs for the last 24 hours, 1 week, 1 month and 1 year.

How to Deploy a new server on Upcloud

Deploying a new server on Upcloud is really easy. Follow these simple steps with me:

  1. Sign in or create a new account.
  2. Once you have finished creating your account after adding your correct billing information, you will be asked to add at least $10 to claim the promotion credits. You may also be asked to validate your details via email, so keep track of your email inbox. (Note: Some users claim that their server was stopped without notice and give bad reviews on online forums for this reason. But this usually happens because they miss an important email from Upcloud that is often sent to validate your information.)
  3. When you finally have been credited with the promotion credits, you have not less than $35 in your balance, depending on your initial addition. That is enough to help you start deploying a server.
  4. Now go to your Upcloud dashboard at https://hub.upcloud.com/ where you will be redirected to the servers section by default and can find the ‘deploy’ button. You can also go to https://hub.upcloud.com/deploy directly to deploy a new server on Upcloud.
  5. You will be redirected to the deployment page, which is divided into multiple sections, viz. Location, Plan, Storage, Automated Backups, Operating System, Optionals and SSH Keys. The next points discuss each of these sections.
  6. Location: Here you need to choose the location of your server. You can choose the location your traffic comes from, or where you expect it to come from. It is also possible that you don’t find the desired location. In such a case, you can opt for the nearest location. For example, if your traffic mostly comes from India, you can choose the Singapore location. But it is always your choice; it is not mandatory to do so. However, it matters in terms of performance.
  7. Plan: You need to choose the plan for your server. You can either opt for a simple plan or choose a flexible plan that lets you customize features such as CPU cores and memory as per your requirements. If you are buying a server for a new website, you can go with the $5 per month plan, or $10 if it is a big website with lots of functions and for which performance is crucial. You can resize your plan later if you have any additional requirements.
  8. Storage: In this section you can choose the amount of storage capacity you require for your project. However, in a simple plan you already opt for storage, CPU and RAM as part of the plan, so you can’t customize the storage in this section if you have opted for any simple plan above. As such, you can only choose whether to go with VirtIO, IDE or SCSI, where the default VirtIO is best based on performance tests. By clicking on the name, you can also rename your storage device.
  9. Automated Backups: In this section you get an option to enable automated backups. It is definitely a great feature to avoid any unexpected data loss. Upcloud gives you three backup options to choose from, viz. the Week Plan, Month Plan and Year Plan. In the week plan, you technically always have 1 week of backups with you, and in the month plan you have the last month of data backed up. In the year plan, you have the last 12 months of data backed up. However, you should choose to enable this option only if you want or need to, as it will increase your cost.
  10. Operating System: The next section asks you to choose the desired operating system for your new server. You can opt for an OS based on your website requirements. For example, you can go with Ubuntu Server or Plesk if you need to host a WordPress website, or Windows Server for an ASP.NET website. By default, the latest version of any operating system is selected, whereas you can change it to an older version by clicking on the change version dropdown below. It is always better to choose the latest or default version for a new website. However, if you are moving an existing site, then you can match the versions.
  11. Optionals: The next section allows you to choose whether to add a public IPv6 address to your server and also enables you to edit metadata with respect to your server.
  12. At the last step, it asks you to put in SSH keys, which is optional. You can also change the hostname and description before you deploy your very first new server on Upcloud.

How To Resize an Existing Upcloud Server

Once you have deployed the server and you want to upgrade or downgrade it, you can do that by resizing your server. To resize your server, you first need to go to your Upcloud dashboard and click on the existing server.

Upcloud Servers List Dashboard

On the overview tab, you can see the Configuration section, where you can find the specifications of your existing server. Below that, there is a resize button to change the server specifications.

Resize Server Upcloud

Once you click that button, it shows you the window below. To resize your server, you will first need to shut it down. Once your server is shut down, the resize section containing the various plans will become active and you can choose the plan of your choice.

Resize Upcloud Server Storage

You can also customize the new server under the flexible plan.

Server Resize on Upcloud

Resizing will take a couple of minutes. You will get a notification and also an email updating you about the status of the server upgrade or downgrade. So, this is how you can easily resize your existing server on Upcloud within a matter of minutes.

Upcloud Pricing

The simple pricing plans of Upcloud hosting are listed below:

CPU RAM/Memory Storage Transfer Pricing
1 1 GB 25 GB 1 TB $5 /mo
1 2 GB 50 GB 2 TB $10 /mo
2 4 GB 80 GB 4 TB $20 /mo
4 8 GB 160 GB 5 TB $40 /mo
6 16 GB 320 GB 6 TB $80 /mo
8 32 GB 640 GB 7 TB $160 /mo
12 48 GB 960 GB 9TB $240 /mo
16 64 GB 1280 GB 10 TB $320 /mo
20 96 GB 1920 GB 12 TB $480 /mo
20 128 GB 2048 GB 24 TB $640 /mo

Is Upcloud better than DigitalOcean and Linode?

Competitively, Upcloud can be compared with DigitalOcean and Linode in terms of performance and infrastructure. Upcloud doesn’t disappoint if you move from either of the two.

Is Upcloud worth the money?

Considering many of the great features such as hourly payment, easy to deploy and manage, scalability, flexibility and performance, one can say that Upcloud hosting is a great value for money.

Conclusion

Without a single doubt, Upcloud is one of the fastest and most reliable hosting service providers as of today. With state-of-the-art technology and cloud infrastructure, it is able to satisfy the needs of users who give importance to performance. Upcloud lets you quickly deploy servers, resize them anytime, take automated backups, put up a firewall, and manage members of your workspace, all with an intuitive layout. With ready-to-scale and highly flexible services, Upcloud lets you control your cost in the early stages and adjust quickly as you grow. Conclusively, you can concentrate more on your projects.

The Review


Upcloud Hosting

PROS

  • Clean and lovely user-interface.
  • Fast servers
  • 100% uptime SLA
  • Very-friendly customer support
  • Extensive knowledge base
  • Easy scaling
  • Server Level Firewall
  • Automated Backup Options
  • Quick server deployment

CONS

  • Not very friendly for rookies.
  • No option for one-click application deployment
  • Low number of data centers


Software Dev. is a Racket (yes, an “old guy” rant) : softwaredevelopment


I notice the amount of code and time to create small and medium-sized business software has been going noticeably up since the 1990’s, not down. It’s like Moore’s Law in reverse. CRUD principles haven’t changed that much, but stacks ignore YAGNI, DRY, and KISS and have made CRUD harder instead of easier. One spends too much development time on tech minutia and battling UI’s instead of domain logic itself.

I will agree there’s more choice now, but the cost of choice seems huge. Most apps don’t need all the what-if’s the bloaters brag about. I doubt most biz owners would want to pay 3x more for all those what-if’s. (Nor am I sure it’s an either-or choice.)

Warren Buffett noticed the same about finance: it’s a fad machine that processes suckers. He got rich by letting his competitors waste their resources on fads. He’s not afraid to say “no” to industry peer pressure or Fear-Of-Being-Left-Behind 🕰👹. The hucksters use the same techniques that trick 35% of the population into thinking the vaccine and election are rigged. Humans are suckers, and IT fad pushers know this.

The industry is pressured to sell books, new software versions, how-to videos, video ads, etc. Publishers of how-to content would lose their livelihood if new features or techniques were vetted in a more scientific way rather than “it’s what all the kool kids are doing!”. Here’s a partial list of resource-wasting fads:

  • Microservices are mostly a JSON-ized version of the XML-web-services-everywhere fad of the early 2000’s. It mostly failed for the same reasons Microservices often go sour. Microservices are a sub-fad of the “Netflixification” of our stacks. What works for a billion users (Netflix) is mega-overkill for 1,000 users. Bloating code with “Async/await” everywhere is also a symptom of this disease.

  • People started throwing out RDBMS for the “NoSql” movement in the early 2010’s, and when the systems matured, they realized they actually needed many features of traditional RDBMS after all. And RDBMS have since added more distributed features. (It was more about “DBA’s don’t let us move fast and break things”, but when your biz matures, you do want to stop breaking things.)

  • Space-wasting UI’s optimized for fingers (mobile) instead of mouse. This results in more scrolling, paging, and more screens to get things done. They may be okay for bedroom social media, but they are crappy for in-office productivity, where 90% still use mice. 🐭 Mice didn’t die, only productivity did. The web still can’t do real GUI’s for office work without bloated buggy JS libraries.

  • “Responsive design” turned the bicycle science of WYSIWYG into rocket science. Now you need to hire a dedicated UI expert to get decent UI’s. I’ve rarely seen a generalist master responsive. Most businesses I know only need about 10% of screens on mobile, so why drag down the other 90%? Because they were told real GUI’s are “obsolete”.

  • OOP was poorly understood yet was shoehorned into everything when it first came out because everyone feared being left off the OOP Train. It’s since been learned it’s not very good at domain modelling, which is where it was first pushed. Many companies were left with OOP spaghetti code. OOP has its place, but took a while to figure out where that is.

  • “Functional programming” keeps coming back in style every 25 years or so, looks great on paper, but fails on real teams. One problem is it’s harder to debug because it lacks intermediate “state” to x-ray via debuggers for clues.

Yes, I know there are exceptions, but in general these are what happened. [Edited.]

Deep linking in flutter | dynamic link


Deep linking  :

The deep linking concept in a Flutter app is explained in this part of the tutorial. When you open a URL, i.e. a dynamic link, the app can open and navigate to the corresponding screen.

Apps like Amazon and Flipkart can use these deep links to move the user exactly to the product page or the offers section based on push notifications.

Deep linking also plays a key role in new app installations, where users are offered a welcome bonus, a referral bonus and so on.

In this tutorial we will make use of Firebase for creating dynamic links. Follow the video tutorial provided to understand how we can perform deep linking in a Flutter app.

 

 

pubspec.yaml :

Add the get and firebase_dynamic_links dependencies to the pubspec.

dependencies:
  flutter:
    sdk: flutter
  get:
  firebase_dynamic_links:

profile.dart :

Create a profile screen so that we can move to it after the deep linking URL is opened.

import 'package:flutter/material.dart';

class Profile extends StatelessWidget {
  const Profile({Key? key}) : super(key: key);

  @override
  Widget build(BuildContext context) {
    return MaterialApp(
      home: Scaffold(
        appBar: AppBar(title: Text("Profile Screen")),
        body: Center(
          child: Container(
            child: Text('Profile Screen'),
          ),
        ),
      ),
    );
  }
}

 

main.dart :

Initialize GetX to make use of navigation in your app, and also provide the routes so that we can navigate to that screen when the link is opened.

GetMaterialApp(
  routes: {
    '/profile' : (BuildContext context) => Profile()
  },

 

In the initState block we need to call the dynamic link method so that it gets initialized as the app loads and can redirect to the profile screen.
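
For example, a minimal initState that wires this up (initDynamicLinks is the helper shown in the full code further below):

@override
void initState() {
  super.initState();
  initDynamicLinks();
}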

 

Add a FirebaseDynamicLinks instance, using which we can fetch the data from dynamic links and process it further.

We have both success and error callbacks and can provide the code depending upon the situation.

 

FirebaseDynamicLinks.instance.onLink(
  onSuccess: (PendingDynamicLinkData? dynamicLink) async {},
  onError: (OnLinkErrorException e) async {}
);

 

Now let’s try to make use of the dynamic link and navigate to the next screen using GetX navigation.

if (deeplink != null) {
  print("deeplink data " + deeplink.queryParameters.values.first);
  Get.toNamed('/profile');
}

 

On error we print the error message to the console.

onError: (OnLinkErrorException e) async {
  print(e.message);
}

 

Here is the full code for the deep linking integration.

import 'package:firebase_dynamic_links/firebase_dynamic_links.dart';
import 'package:flutter/material.dart';
import 'package:flutter_deep_linking/profile.dart';
import 'package:get/get.dart';

void main() {
  runApp(MyApp());
}

class MyApp extends StatefulWidget {
  const MyApp({Key? key}) : super(key: key);

  @override
  _MyAppState createState() => _MyAppState();
}

class _MyAppState extends State<MyApp> {

  @override
  void initState() {
    super.initState();
    initDynamicLinks();
  }

  @override
  Widget build(BuildContext context) {
    return GetMaterialApp(
      routes: {
        '/profile': (BuildContext context) => Profile()
      },
      home: Scaffold(
        appBar: AppBar(title: Text("DeepLink")),
        body: Center(
          child: Container(
            child: Text('Deeplinking Page'),
          ),
        ),
      ),
    );
  }
}

void initDynamicLinks() async {
  FirebaseDynamicLinks.instance.onLink(
    onSuccess: (PendingDynamicLinkData? dynamicLink) async {
      final Uri? deeplink = dynamicLink?.link;

      if (deeplink != null) {
        print("deeplink data " + deeplink.queryParameters.values.first);
        Get.toNamed('/profile');
      }
    },
    onError: (OnLinkErrorException e) async {
      print(e.message);
    },
  );
}




 

Output :

This screen depicts the usage of Flutter deep linking.

Deep linking

 

If you have any queries on this deep linking tutorial, do let us know in the comment section below. If you like this tutorial, do like and share it for more interesting tutorials.

The App Store is the Games Store. Apple should recognize this.


One passage in the 185-page ruling issued last week in the Epic Games v. Apple lawsuit forces clarity on the convoluted patchwork of rules governing platform fee applicability by expressly acknowledging the beating heart of the App Store economy:

Further, the evidence demonstrates that most App Store revenue is generated by mobile gaming apps, not all apps. Thus, defining the market to focus on gaming apps is appropriate. Generally speaking, on a revenue basis, gaming apps account for approximately 70% of all App Store revenues. This 70% of revenue is generated by less than 10% of all App Store consumers. These gaming-app consumers are primarily making in-app purchases which is the focus of Epic Games’ claims. By contrast, over 80% of all consumer accounts generate virtually no revenue, as 80% of all apps on the App Store are free.

The App Store is a mobile games distribution business: the vast majority of App Store revenues are generated by in-app purchases from games because most other app categories are already exempted, effectively or explicitly, from paying a platform fee. These exemptions are applied by category:

  • Apps that deliver fulfillment non-digitally (Uber, Airbnb, Skyscanner, Doordash, etc.). These apps don’t pay a platform fee on revenues generated from within the app because the goods and services they provide to users are fulfilled non-digitally;
  • Reader apps (Netflix, Spotify, HBO Max, YouTube TV, etc.). These apps provide access to a previously-purchased or subscription-gated catalogue of content. While these apps can use Apple’s payments tools to charge users for subscriptions — and they must pay the App Store platform fee when they do so — the largest of them (e.g. Netflix) don’t use those tools at all, and most at least offer a web-based payments system. Thus, this category generates some revenue for the App Store, but much of the revenue generated by this category is exempt from the App Store platform fee. Prior to Apple’s settlement with Japan’s FTC, these apps could not link to non-App Store payments processes from within their apps; now they can, so this category’s contribution to App Store revenue is likely to shrink further.

The categories that aren’t exempt from App Store fees are games and so-called “utility apps” like Tinder, Calm, etc. that 1) charge users for content and functionality that is fulfilled within the app and 2) feature content that is interactive and not previously purchased.

But, per the above statistics, Apple has mostly already scoped the applicability of its platform fee primarily to games by having exempted other categories. And within the gaming category, the distribution of spend is heavily skewed toward extreme spending behaviors. From the ruling:

Importantly, spending on the consumer side is also primarily concentrated on a narrow subset of consumers: namely, exorbitantly high spending gamers. In the third quarter of 2017, high spenders, accounting for less than half a percent of all Apple accounts, spent a “vast majority of their spend[] in games via IAP” and generated 53.7% of all App Store billings for the quarter, paying in excess of $450 each. In that same quarter, medium spenders ($15- $450/quarter) and low spenders (<$15/quarter), constituting 7.4% and 10.8% of all Apple accounts, accounted for 41.5% and 4.9% of all App Store billing, respectively.

This is not surprising to anyone who has worked in mobile gaming. In fact, this monetization reality is a design choice. As I write in Freemium Economics:

The second pattern is that very few users of freemium products ever monetize, or spend money on them. The low proportion of users who monetize in freemium products contributes to the necessity of large potential scale: a low percentage of monetizing users within a very large total user base might represent a respectable absolute number of people. This concept is referred to in this book as the 5% rule, or the understanding that no more than 5 percent of a freemium product’s user base can be expected to monetize…Thus the 5% rule is not, in fact, a rule; it is a design decision through which the developer embraces the practicalities of the freemium business model, which suggest that a small, dedicated minority of users can monetize to greater aggregate effect than can a larger population of users that has monetized universally through paid access. This design decision is an outgrowth of the freemium model scale requirement: the larger the total user base, the more meaningful will be the minority proportion of users who monetize.

Why does this matter? Because games is the category least likely to substantially benefit from re-routing revenues through alternative payments processors. Extreme spending behavior often involves long sequences of small, one-off purchases. The #1 Top Grossing iOS game in the US as of this writing is Roblox; the highest-priced IAP offered by Roblox costs $9.99 (note that some of Roblox’s IAPs represent recurring subscriptions).

Mobile games economies are predicated on low-friction purchase mechanics. Will users click out to a website to save money on a low-cost IAP, relative to what that IAP would cost with a double-tap of the power button? How much of a discount would app developers need to offer in order to successfully incentivize that new behavior? Put differently: how much of the platform fee can games developers actually expect to recover with web-based payments processing?

As I argue in Is app store regulation too little, too late?, the additional friction inherent in forwarding a user to a web-based payment processor, especially if it requires the input of a credit card number, undercuts the opportunity presented by the prospect of web-based payments. This is true generally, but it’s acutely true for mobile games, and Apple should recognize this. Perhaps some games developers can successfully drive users to a web-based payments platform without having to give up the entirety of their platform fee savings, but it’s unlikely that they’ll achieve that for all or even a majority of revenues.

Apple can abandon its disorienting, unintuitive, and inordinately gerrymandered map of App Store categories and simply apply the same platform fee rules to all in-app purchases of digital goods, treating all categories of apps as essentially the same while allowing a link to off-App Store payments processors. The largest streaming services and “sharing economy” apps are already not paying Apple; some mix of subscription apps are paying Apple a platform fee on some portion of their revenues, and this portion will decrease with the introduction of a payments link, but these apps only drive 30% of revenue, anyway; and the proportion of mobile games revenues subject to the platform fee is unlikely to materially change.

My favorite musical discoveries of 2020 /my-favorite-musical-discoveries-of-2020/ /my-favorite-musical-discoveries-of-2020/#respond Fri, 22 Oct 2021 16:05:03 +0000 /my-favorite-musical-discoveries-of-2020/ Music is one of my ongoing pleasures, but I don’t write about it much – as the old witticism says “writing about music is like dancing about architecture.” But I do find other people’s writing can lead me to some great music, and writing about my favorite albums I acquired this year is a way…


Music is one of my ongoing pleasures, but I don’t write about it much –
as the old witticism says “writing about music is like dancing about
architecture.” But I do find other people’s
writing can lead me to some great music, and writing about my favorite
albums I acquired this year is a way of paying
that debt forward.

I talk about “albums” because I’m still stuck in the mold of
buying albums. Back in my 20’s I started a habit of buying 4 CDs a month.
I’d listen to the new batch pretty heavily during that month, then shift
them to a rotation with all the rest of my accumulated albums once I got a
new batch. Given that was three decades ago, I’ve amassed a lot of music,
and still find this way of exploring the musical world to suit me.

Last year saw a significant shift in where I bought my music. For many
years I used the emusic service, which suited my 4-a-month buying habit
really well. But over recent years their catalog got more withered and I
eventually gave up my subscription. Now my first port of call for music is
Bandcamp. From what I can tell, the site passes
on a bigger share of my dollars to the artist than buying through someone
like Amazon. In addition, most artists on the platform let you listen to
their music through the browser before buying, which allows me to sample a
new band that I’m unsure of. That way I can get a sense of whether I’ll like
them and whether they’ll form a good mix in my batch of four albums.

My tastes are not mainstream, and I doubt I’d find any of these albums on
my car radio. I tend to focus on jazz and world/roots music, but it’s easier
to express what I like by going through my arbitrary top six of new-to-me
albums in 2020.

v2.0 by GoGo Penguin

sample track: Kamaloka

My biggest new discovery of recent years is the Manchester-based jazz
trio: GoGo Penguin (thanks Badri!). They play in a style of jazz that I’ve
been listening to a lot recently, a style that takes a lot of influence
from minimalist classical music, and EDM. There’s
a lot of emphasis on texture, rather than melody and soloing. GoGo Penguin
has a sound that’s very integrated, with lots of close interplay between
the instruments, rather than one soloing and the others
accompanying. If this kind of thing appeals to you, some similar albums I
got this year are from Majamisty TriO,
Aaron Parks, and EYOT.

Hawk To The Hunting Gone by You Are Wolf

sample track: Three Ravens

You Are Wolf is an avant-garde take on English folk music. I’ve long
enjoyed British folk music, with women singers like Sandy Denny, June
Tabor, and Eddi Reader all having confirmed places on my virtual shelves.
You Are Wolf is a whole new take on the folk sound. Kerry Andrew’s voice
matches those remarkable voices I named earlier, but the group’s take on
the music is far more experimental. Three Ravens is a dark English folk ballad dating from at least the
seventeenth century. You Are Wolf reimagines this so that it sounds ahead
of its time in the twenty-first.

Bridges by Adam Baldych & Helge Lien Trio

sample track: Mosaic

The common take for jazz violin is the gypsy sound of Stéphane
Grappelli. But the violin can offer much more to jazz than that, and the
Polish violinist Adam Baldych demonstrates this in collaboration with
the Norwegian Helge Lien Trio. I don’t usually pay very much attention
to record labels, but ACT has become an
exception as they have excellent taste in European jazz and have led me
to many favorite sounds in the last few years. A couple of other ACT
highlights for me this year were Vincent Peirani
& Emile Parisien, and Matthieu
Saglio. ACT just started putting their music out on Bandcamp,
which I hope will help me explore their catalog further.

Dirty Bourbon River Show describe themselves as “Circus Rock”, which is
a fair description of a sound that suggests a big tent from their home of
New Orleans. The compositions of singer/multi-instrumentalist Noah Adams
mix his gravelly vocals with bayou rhythms and the hyper-active sax of
Matt Thomas. I bought what was available of their physical CDs a few years
ago, but Bandcamp has given them the opportunity to make the rest of their
discography available.

Somi is an American singer-songwriter, who I’ve heard described as a
melding of Sade and Fela Kuti. In this live album she sings with the
Frankfurt Radio Big Band, and the combination of her vocals and big band
produce that kind of international cocktail that is as magical as it is
unusual. The arrangements have the band perfectly backing her songs,
keeping her, and the distinctly African sound of guitarist Hervé Samb,
front and center.

Suite to be You and Me by I Think You’re Awesome

sample track: The Distance

This album is a collaboration between the Danish jazz group I Think
You’re Awesome and the Berlin-based Taïga String Quartet. A line-up
featuring Wurlitzer and Banjo is as quirky as I Think You’re Awesome’s
name, but blends well with the string quartet in a harmony inspired by the
marriage of their leader with the string quartet’s cellist. The resulting
album is uplifting and joyful, and I hope their relationship mirrors it.

While Bandcamp has been my primary channel for new music
this year, I do get some from other sources, particularly for more
mainstream labels that don’t have their artists on Bandcamp. This year I
finally got around to listening to Landfall: Laurie Anderson’s collaboration with the
Kronos Quartet. I’ve long liked Laurie Anderson’s music and this could be my
favorite album of hers. Some even-older music listening was triggered by
watching the recent film biography of Miles
Davis. I’d not listened to much of his later fusion work, but really enjoyed
finally getting into In a Silent Way and
A Tribute To Jack Johnson.

I hope these suggestions have given you some new music to enjoy. If so, I
should mention that my best sources for new music in the last couple of
years have been Dave Sumner’s monthly columns on the best jazz on Bandcamp Daily,
and the always enjoyable James Catchpole on the OK Jazz podcast.


Android – Building a podcast app series: 1. mini in-app player /android-building-a-podcast-app-series-1-mini-in-app-player/ /android-building-a-podcast-app-series-1-mini-in-app-player/#respond Fri, 15 Oct 2021 18:42:42 +0000 /android-building-a-podcast-app-series-1-mini-in-app-player/ Start writing here…I am currently working on a project to rewrite an existing podcast app that was originally built with xamarin into a fully native Android app. The most important feature of any podcast app is the player, and specifically, the in-app player that has to: be visible on all screens have collapsed and expanded…


I am currently working on a project to rewrite an existing podcast app, originally built with Xamarin, into a fully native Android app. The most important feature of any podcast app is the player, and specifically the in-app player, which has to:

  • be visible on all screens
  • have collapsed and expanded view

Let’s clarify the second point: the in-app player should have two modes. In collapsed mode the player’s view is small and sits below the main content; in expanded mode the player’s view occupies the whole screen. Ideally, there should be an animation when the player transitions between those two states. Let’s start with problem number 1.

If we use the Navigation library and the single-activity pattern from AAC (Android Architecture Components), we can easily make our in-app player visible on all screens by constraining the navigation host fragment to sit above the in-app player’s view. Since the navigation framework loads and unloads all views inside the navigation host fragment, every view in our app spans only up to the in-app player. If the player is hidden, all views automatically span over it; otherwise, they span to just above the player.

A nice side effect of using the single-activity pattern is that the player itself is configured in one place, the main activity, and in combination with a view model we can implement the player without violating the DRY principle. Now let’s talk about implementing the second feature.
Layout behavior

Ever since CoordinatorLayout came out, Android has become a lot more flexible in terms of animations and interactions between views. In this specific use case, all I had to do was write a normal ConstraintLayout and add one line to it:

<androidx.constraintlayout.widget.ConstraintLayout
....
app:layout_behavior="com.google.android.material.bottomsheet.BottomSheetBehavior">

That was enough to get a view that can sit at the bottom of the screen, be expanded, and animate those state changes out of the box. Pretty awesome, if you ask me. So, what does the final layout look like?

<CoordinatorLayout>
   <MainContent/>
   <InAppPlayer/>
</CoordinatorLayout>

Of course, nothing is this simple, is it? 😃. But, what if our app has tabs and a BottomNavigationView? Our in-app player should, of course, sit above the BottomNavigationView but below the main content. One solution to this edge case is to position the main content above the bottom tabs’ view using constraint layout.
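
To give a feel for how the player can then be driven from code, here is a minimal Kotlin sketch (the in_app_player view id and the method names are illustrative, not taken from the project): the activity grabs the BottomSheetBehavior that app:layout_behavior attached to the player view and switches its state to hide, collapse, or expand the player.

import android.os.Bundle
import android.view.View
import androidx.appcompat.app.AppCompatActivity
import com.google.android.material.bottomsheet.BottomSheetBehavior

// Hypothetical single activity hosting the in-app player view.
class MainActivity : AppCompatActivity() {

    private lateinit var playerBehavior: BottomSheetBehavior<View>

    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        setContentView(R.layout.activity_main)

        // The player's root layout declares the BottomSheetBehavior via
        // app:layout_behavior, so the behavior instance can be retrieved
        // directly from that view.
        val playerView = findViewById<View>(R.id.in_app_player)
        playerBehavior = BottomSheetBehavior.from(playerView)

        // Allow hiding and start hidden until something is playing.
        playerBehavior.isHideable = true
        playerBehavior.state = BottomSheetBehavior.STATE_HIDDEN
    }

    // Show the collapsed mini player when playback starts.
    fun showMiniPlayer() {
        playerBehavior.state = BottomSheetBehavior.STATE_COLLAPSED
    }

    // Animate to the full-screen player when the mini player is tapped.
    fun expandPlayer() {
        playerBehavior.state = BottomSheetBehavior.STATE_EXPANDED
    }
}

The collapsed height comes from app:behavior_peekHeight on the player view, and the behavior itself animates these state changes.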

DRY principle and ViewModel

Since the player exists across multiple screens (fragments), we need a view model that groups all the player features in one place and is reused across all views. What better way to implement this than with the AAC ViewModel class? We can easily share this view model across fragments and keep the player UI in sync at all times. PlayerViewModel can get its data from a PodcastRepository class, which makes network calls or reads from the local database.
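
As a rough sketch of that sharing (the class names, the Episode model, and the fields below are illustrative, not taken from the project), an activity-scoped view model exposes the player state, and every fragment retrieves the same instance with the activityViewModels() delegate from fragment-ktx:

import androidx.fragment.app.Fragment
import androidx.fragment.app.activityViewModels
import androidx.lifecycle.LiveData
import androidx.lifecycle.MutableLiveData
import androidx.lifecycle.ViewModel

// Illustrative episode model.
data class Episode(val id: String, val title: String, val streamUrl: String)

// Scoped to the single activity, so every fragment observes the same state.
class PlayerViewModel : ViewModel() {

    // In the real app this state would be fed by a PodcastRepository
    // (network calls or the local database), as described above.
    private val _nowPlaying = MutableLiveData<Episode?>()
    val nowPlaying: LiveData<Episode?> get() = _nowPlaying

    fun play(episode: Episode) {
        _nowPlaying.value = episode
    }
}

// Any screen can talk to the same player without duplicating logic.
class EpisodeListFragment : Fragment() {

    private val playerViewModel: PlayerViewModel by activityViewModels()

    fun onEpisodeClicked(episode: Episode) {
        playerViewModel.play(episode)
    }
}

Because the view model is scoped to the activity rather than to any single fragment, the mini player and every screen observe the same nowPlaying state and stay in sync.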

Please refer to this GitHub link for the code.

Stay tuned for the next article in this series!

Live Templates on Android Studio and IntelliJ to improve your productivity /live-templates-on-android-studio-and-intelllij-to-improve-your-productivity/ /live-templates-on-android-studio-and-intelllij-to-improve-your-productivity/#respond Mon, 11 Oct 2021 16:05:09 +0000 /live-templates-on-android-studio-and-intelllij-to-improve-your-productivity/ Live Templates are a way to avoid repetitive code in development environments like Android Studio or IntelliJ. When we are writing code, there are times when certain repetitive code can happen, and there is no way to encapsulate it. For these occasions, Live Templates are an ideal solution that can save you a lot of…


Live Templates are a way to avoid repetitive code in development environments like Android Studio or IntelliJ.

When we are writing code, there are times when certain repetitive code can happen, and there is no way to encapsulate it.

For these occasions, Live Templates are an ideal solution that can save you a lot of time.

All you need is a template and a name, and with it you can quickly add that custom code for the particular case.

I will give you an example that you’ll run into a lot if you use the Architecture Components, and in particular the MVVM pattern, on Android.

Creating Live Templates for LiveData

It happens that when we want to use a LiveData in a ViewModel, we usually want a mutable version so that we can update its value.

But we don’t want that value to be modifiable from the outside. The recommended approach is to write code like this:

private val _message = MutableLiveData<String>()
val message: LiveData<String> get() = _message

In this way, we protect that data from being modified by another class that has access to our state.
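
As a minimal sketch of the pattern in context (the class and property names here are just an example, not from any particular project), only the ViewModel can push new values into the private MutableLiveData, while the UI observes the read-only LiveData:

import androidx.lifecycle.LiveData
import androidx.lifecycle.MutableLiveData
import androidx.lifecycle.ViewModel

// Example ViewModel using the backing-property pattern.
class GreetingViewModel : ViewModel() {

    private val _message = MutableLiveData<String>()
    val message: LiveData<String> get() = _message

    // State changes happen only inside the ViewModel.
    fun onUserArrived(userName: String) {
        _message.value = "Hello, $userName!"
    }
}

A Fragment or Activity then observes message, for example with viewModel.message.observe(viewLifecycleOwner) { text -> ... }, but has no way to write to it.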

Check my free guide to create your first project in 15 minutes!

But if you look, there is a lot of boilerplate here that we don’t want to type out every time.

So what can we do for this LiveData pattern? Something like this: by typing the name of the template and selecting it in the suggestions, most of the code is filled in automatically.

The template then lets us fill in a couple of gaps: the variable name and the type. Since the variable name is used in both lines, filling it in one line fills it in the other.

How do we do this?

1. Create a new Live Template

To do this go to:

  • Settings (Preferences on Mac)
  • Editor
  • Live Templates
  • Click on “+”

2. Choose the abbreviation

Choose the abbreviation you want. It will appear in the autocomplete when you write it.

You can also add a description to remember what it’s used for:

3. Write the text of the Live Template

Here you write the code that you don’t want to repeat every time. Write $x$ (any name between dollar signs) in each place where you want a variable.

Autofill will stop at that point for you to fill it in. If you use the same variable name in other places, its value will be repeated in those places.

In our case we would write:

private val _$VAR$ = MutableLiveData<$TYPE$>()
val $VAR$: LiveData<$TYPE$> get() = _$VAR$

4. Choose the context

This indicates the situations in which this live template will be suggested.

In our case we want it for Kotlin code:

And you can already use it!
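
For illustration, suppose the abbreviation chosen in step 2 was mld (just an example name). Typing mld in a Kotlin class, selecting the template, and filling the two placeholders with loading and Boolean would expand to something like this:

import androidx.lifecycle.LiveData
import androidx.lifecycle.MutableLiveData
import androidx.lifecycle.ViewModel

class SearchViewModel : ViewModel() {

    // Result of expanding the template with VAR = loading and TYPE = Boolean.
    private val _loading = MutableLiveData<Boolean>()
    val loading: LiveData<Boolean> get() = _loading
}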

Live Templates: Conclusion

As you can see, it is very easy to define these live templates, and they can save you from repeating a lot of unnecessary code.

So I encourage you to make your own and share them in the comments section.

If you want a cheatsheet, you can find one of live templates on my Instagram profile.
