
Comment TVs, no. Monitors, yes. (Score 1) 166

The manufacturers can't even agree on which curvature is better, concave or convex (although most are concave now).
If you live on your own and watching TV is a purely solitary experience (or shared with one other person at most), a curved TV can be fine. But if you have a few friends over to watch a movie or a sports game, only the person in the centre gets a decent view.

For a monitor, on the other hand, you're sitting on your lonesome right in the sweet spot, and a curved monitor can be great for some tasks.

Comment Re:WTF does it do? (Score 3, Informative) 48

OK, now it's starting to make more sense looking at the use cases

Here is a description of a few of the popular use cases for Apache Kafka.

Messaging
Kafka works well as a replacement for a more traditional message broker. Message brokers are used for a variety of reasons (to decouple processing from data producers, to buffer unprocessed messages, etc.). Compared with most messaging systems, Kafka has better throughput, built-in partitioning, replication, and fault tolerance, which makes it a good solution for large-scale message processing applications.
In our experience messaging uses are often comparatively low-throughput, but may require low end-to-end latency and often depend on the strong durability guarantees Kafka provides.

In this domain Kafka is comparable to traditional messaging systems such as ActiveMQ or RabbitMQ.
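The decoupling a broker gives you can be sketched in a few lines of plain Python. This is only an illustration of the producer/consumer pattern described above, using a stdlib queue as a stand-in for a Kafka topic; none of the names come from any real Kafka client API.

```python
import queue
import threading

# A stdlib FIFO queue stands in for a Kafka topic partition: producers
# enqueue messages without knowing who consumes them, and consumers
# drain the buffer at their own pace.
broker = queue.Queue()

def producer():
    for i in range(5):
        broker.put(f"order-{i}")  # never blocks waiting on the consumer

def consumer(out):
    for _ in range(5):
        out.append(broker.get())  # processes whenever it is ready

received = []
t1 = threading.Thread(target=producer)
t2 = threading.Thread(target=consumer, args=(received,))
t1.start(); t2.start()
t1.join(); t2.join()
print(received)  # all five messages arrive, in order
```

The point is only that neither side holds a reference to the other: the buffer in the middle is what Kafka provides, at scale and durably.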

Website Activity Tracking
The original use case for Kafka was to be able to rebuild a user activity tracking pipeline as a set of real-time publish-subscribe feeds. This means site activity (page views, searches, or other actions users may take) is published to central topics with one topic per activity type. These feeds are available for subscription for a range of use cases including real-time processing, real-time monitoring, and loading into Hadoop or offline data warehousing systems for offline processing and reporting.
Activity tracking is often very high volume as many activity messages are generated for each user page view.

Metrics
Kafka is often used for operational monitoring data. This involves aggregating statistics from distributed applications to produce centralized feeds of operational data.

Log Aggregation
Many people use Kafka as a replacement for a log aggregation solution. Log aggregation typically collects physical log files off servers and puts them in a central place (a file server or HDFS perhaps) for processing. Kafka abstracts away the details of files and gives a cleaner abstraction of log or event data as a stream of messages. This allows for lower-latency processing and easier support for multiple data sources and distributed data consumption. In comparison to log-centric systems like Scribe or Flume, Kafka offers equally good performance, stronger durability guarantees due to replication, and much lower end-to-end latency.

Stream Processing
Many users of Kafka process data in pipelines consisting of multiple stages, where raw input data is consumed from Kafka topics and then aggregated, enriched, or otherwise transformed into new topics for further consumption or follow-up processing. For example, a processing pipeline for recommending news articles might crawl article content from RSS feeds and publish it to an "articles" topic; further processing might normalize or deduplicate this content and publish the cleansed article content to a new topic; a final processing stage might attempt to recommend this content to users. Such processing pipelines create graphs of real-time data flows based on the individual topics. Apache Kafka now includes a light-weight but powerful stream processing library called Kafka Streams to perform data processing like this. Apart from Kafka Streams, alternative open source stream processing tools include Apache Storm and Apache Samza.
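The multi-stage pipeline in the news-article example can be mimicked with plain functions. This toy sketch uses Python lists as stand-ins for the "articles" and cleansed-article topics from the paragraph above; the stage logic is invented for illustration.

```python
# Stand-in for the "articles" topic: raw crawled content with
# inconsistent casing/whitespace and a duplicate.
articles = ["  Breaking News ", "breaking news", "Local Story"]

def normalize(topic):
    # stage 1: canonicalize each record
    return [a.strip().lower() for a in topic]

def deduplicate(topic):
    # stage 2: drop records already seen, preserving order
    seen, out = set(), []
    for a in topic:
        if a not in seen:
            seen.add(a)
            out.append(a)
    return out

# Stand-in for the cleansed topic a later "recommend" stage would consume.
cleansed_articles = deduplicate(normalize(articles))
print(cleansed_articles)  # ['breaking news', 'local story']
```

In real Kafka Streams each stage would read from one topic and write to another, so the graph of topics is the pipeline.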

Event Sourcing
Event sourcing is a style of application design where state changes are logged as a time-ordered sequence of records. Kafka's support for very large stored log data makes it an excellent backend for an application built in this style.
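A minimal event-sourcing sketch, to make the idea concrete: state changes are appended to a time-ordered log, and current state is rebuilt by replaying it. The bank-account domain and event names here are made up for illustration.

```python
# The log is the source of truth; state is derived, never stored directly.
log = [
    ("deposit", 100),
    ("withdraw", 30),
    ("deposit", 50),
]

def replay(events):
    # Fold the time-ordered events into the current state.
    balance = 0
    for kind, amount in events:
        balance += amount if kind == "deposit" else -amount
    return balance

print(replay(log))  # 120
```

Kafka's role in this style is holding that log durably and at scale, so any consumer can rebuild state by replaying from the beginning.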

Commit Log
Kafka can serve as a kind of external commit log for a distributed system. The log helps replicate data between nodes and acts as a re-syncing mechanism for failed nodes to restore their data. The log compaction feature in Kafka helps support this usage. In this usage Kafka is similar to the Apache BookKeeper project.
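Conceptually, log compaction means that for each key only the most recent value must be retained for a failed node to restore its state. This sketch just shows the retained result; real Kafka compacts per partition in the background rather than in one pass.

```python
# A sequence of (key, value) writes, oldest first.
log = [("user1", "alice"), ("user2", "bob"), ("user1", "alicia")]

def compact(entries):
    # Later writes for the same key overwrite earlier ones, so only the
    # last value per key survives compaction.
    latest = {}
    for key, value in entries:
        latest[key] = value
    return latest

print(compact(log))  # {'user1': 'alicia', 'user2': 'bob'}
```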

Comment WTF does it do? (Score 1) 48

I've got no idea what Kafka does, and the summary really doesn't tell you much at all. I was about to put in a helpful post saying what it is, but even after visiting their home page I've still got no idea.

Apparently Kafka is used for building real-time data pipelines and streaming apps. It is horizontally scalable, fault-tolerant, wicked fast, and runs in production in thousands of companies.

How about the Intro
We think of a streaming platform as having three key capabilities:
      It lets you publish and subscribe to streams of records. In this respect it is similar to a message queue or enterprise messaging system.
      It lets you store streams of records in a fault-tolerant way.
      It lets you process streams of records as they occur.
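Those three capabilities fit in a toy in-memory sketch. Every name here is invented; real Kafka does this across distributed, replicated brokers with persistent partitioned logs.

```python
class MiniPlatform:
    """Toy illustration of the three capabilities listed above."""

    def __init__(self):
        self.topics = {}       # capability 2: records are stored
        self.subscribers = {}

    def subscribe(self, topic, callback):
        self.subscribers.setdefault(topic, []).append(callback)

    def publish(self, topic, record):          # capability 1: pub/sub
        self.topics.setdefault(topic, []).append(record)
        for cb in self.subscribers.get(topic, []):
            cb(record)                         # capability 3: process as they occur

platform = MiniPlatform()
seen = []
platform.subscribe("clicks", seen.append)
platform.publish("clicks", {"page": "/home"})
platform.publish("clicks", {"page": "/about"})
print(len(platform.topics["clicks"]), len(seen))  # 2 2
```

Subscribers here are called synchronously on publish; in Kafka, consumers pull from the stored log independently, which is why storage and processing are separate capabilities.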

What is Kafka good for?
It gets used for two broad classes of application:
      Building real-time streaming data pipelines that reliably get data between systems or applications
      Building real-time streaming applications that transform or react to the streams of data

OK, I still am not really sure what it does.

Comment Re:battery life a braindead argument (Score 1) 300

I have a 2013 Mac Pro and a new 2016 MacBook Pro 13".
Whilst multi-threaded performance is a different matter altogether, single-core performance is pretty much on par (with a slight edge to the laptop) when comparing the two machines. The vast majority of software I run is single-threaded, as I don't do video editing, 3D or gaming.

This is an Intel Xeon E5-1620 quad-core versus an i7-6567U.

Power consumption is 130W for the Xeon versus 28W for the i7.

Comment A possible solution? (Score 4, Interesting) 221

A possible solution is not to ban sales to bots per se, but instead to verify that the identity of the person redeeming the ticket at the door matches the person who purchased it (by verifying CC details, or even something as basic as their name).

If tickets have conditions on them that prevent their usage by anyone other than the person who originally bought them, then there can be no market for resold tickets. Let the scalpers buy as many tickets as they want, but eliminate the market for them to be resold.
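The door check proposed above could work roughly like this. Everything in the sketch is hypothetical: the venue stores a hash of the purchaser's name (or card details) at sale time and compares it at entry, so a resold ticket fails verification without the venue keeping the raw details on the ticket.

```python
import hashlib

def fingerprint(name):
    # Normalize then hash, so the ticket carries no readable personal data.
    return hashlib.sha256(name.strip().lower().encode()).hexdigest()

# Recorded at purchase time (hypothetical ticket record).
ticket = {"id": "T-1001", "purchaser_hash": fingerprint("Jane Citizen")}

def admit(ticket, presented_name):
    # At the door: does the presented ID match the original purchaser?
    return ticket["purchaser_hash"] == fingerprint(presented_name)

print(admit(ticket, "Jane Citizen"))   # True  -- original buyer gets in
print(admit(ticket, "Scalper Buyer"))  # False -- resold ticket is refused
```

A real scheme would need salting and a fallback for legitimate transfers (refund-and-resell through the vendor), but the principle is the same.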

Ticketek Australia now states as part of its conditions of sale: "This ticket may not, without the prior written consent of Ticketek or the Seller, be resold at a premium or used for advertising, promotion or other commercial purposes (including competitions and trade promotions) or to enhance the demand for other goods or services. If a ticket is sold or used in breach of this condition, the bearer of the ticket will be refused admission."

If you knowingly purchase a scalped ticket, you're taking a huge risk that you won't get in to the event.

Comment Re:How does this work? (Score 2) 368

With my car, once it's been locked with the button on the key fob, after a certain amount of time it deadlocks the doors: they cannot be opened from the inside or outside without being unlocked first. The unlock button on the driver's door also stops working once the car has been locked from the fob.

This means I could, if I wanted to, lock the car with the windows partially down and after a minute or so the car would be deadlocked - even if someone reached in to open the door, they would be unable to.
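The behaviour described reads like a small timed state machine. This sketch is purely illustrative; the delay and state names are invented, not any manufacturer's actual logic.

```python
class CarLocks:
    """Toy model: fob-lock, then deadlock after a delay."""

    def __init__(self, deadlock_after=60):
        self.state = "unlocked"
        self.deadlock_after = deadlock_after  # seconds after fob lock
        self.locked_at = None

    def fob_lock(self, now):
        self.state = "locked"
        self.locked_at = now

    def tick(self, now):
        # Transition to deadlocked once the delay has elapsed.
        if self.state == "locked" and now - self.locked_at >= self.deadlock_after:
            self.state = "deadlocked"

    def open_from_inside(self):
        # Interior handle works when merely locked, not when deadlocked.
        return self.state != "deadlocked"

car = CarLocks()
car.fob_lock(now=0)
car.tick(now=30)
print(car.open_from_inside())  # True  -- still just locked
car.tick(now=61)
print(car.open_from_inside())  # False -- deadlocked, handles disabled
```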

Comment Apple are doing what they have done every year... (Score 2) 142

Apple are doing what they have done every single year - retiring old models from their supported lineup. Film at 11.

Every year, a range of Macs pass through the support statuses from "Supported" to "Vintage" to "Obsolete".

Vintage products are those that were discontinued more than 5 and less than 7 years ago. Apple has generally discontinued hardware service for vintage products in most regions other than the state of California and Turkey.
Obsolete products are those that were discontinued more than 7 years ago. Apple has discontinued all hardware service for obsolete products with no exceptions. Service providers cannot order parts for obsolete products through Apple.
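The classification above boils down to a simple threshold function. The function name and input are invented for illustration of the rule as described.

```python
def support_status(years_since_discontinued):
    # Apple's categories as described above: >7 years obsolete,
    # 5-7 years vintage, otherwise still supported.
    if years_since_discontinued > 7:
        return "Obsolete"
    if years_since_discontinued > 5:
        return "Vintage"
    return "Supported"

print(support_status(3))  # Supported
print(support_status(6))  # Vintage
print(support_status(8))  # Obsolete
```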
