List:       kde-bugs-dist
Subject:    [palapeli] [Bug 337972] New: IDEA: Non-uniform piece size based on image detail
From:       Matthew Woehlke <mw_triad () users ! sourceforge ! net>
Date:       2014-08-02 0:36:42
Message-ID: bug-337972-17878 () http ! bugs ! kde ! org/

https://bugs.kde.org/show_bug.cgi?id=337972

            Bug ID: 337972
           Summary: IDEA: Non-uniform piece size based on image detail
           Product: palapeli
           Version: unspecified
          Platform: Fedora RPMs
                OS: Linux
            Status: UNCONFIRMED
          Severity: wishlist
          Priority: NOR
         Component: slicers
          Assignee: majewsky@gmx.net
          Reporter: mw_triad@users.sourceforge.net
                CC: kde-games-bugs@kde.org

This is going to be a little bit odd, but the story is interesting; please bear
with me.

I had a dream (not the M. L. King kind, but the waking-from-sleep kind) this
morning, in which there was a jigsaw puzzle. This puzzle was unusual for having
pieces of widely varying sizes, based on the local level of detail.

While still half asleep, I a) got the idea that it would be cool if Palapeli
could do this, and b) came up with an algorithm.

The first step is to compute a Feature Density Estimate (FDE) of the input
image, which should be a single-valued function over two dimensions that is
either continuous or with discrete sampling of a comparable order to the input
data (a grayscale image of equal dimensions will do). There are many possible
algorithms for this (edge detection, etc.). I'm thinking, for each pixel, of
computing the average per-channel difference with neighboring pixels (either a
uniform-weighted average over directly adjacent neighbors, ignoring diagonals,
or a weighted average giving diagonals, say, sqrt(2)/2 times the weight of
direct neighbors), and then taking the average of the per-channel results
(probably weighted by relative luminosity, a la qGray). Then apply a smoothing
filter (probably a Gaussian blur) to the result. (One reason to propose a
Gaussian blur is that it seems sensible to choose the radius based on the image
size and desired piece count, probably as some fraction of the input image area
divided by the desired piece count.)

Now we need some random points. Obtain these via Poisson Disk Sampling (well
documented), using the FDE as a density function (black = lower density, white
= higher density). This will give us a set of points that are
clustered in high-detail areas and sparse in low-detail areas. Note that at
this point we want to have quite a few more points than the number of desired
pieces.
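The sampling step might look like the sketch below. Note the assumptions: this
is the naive dart-throwing variant of Poisson disk sampling (not the faster
grid-based algorithms from the literature), the minimum-distance radius is
linearly interpolated between r_min and r_max from the density value, and the
`density(x, y)` callback standing in for the FDE is expected to return values
in [0, 1].

```python
import math
import random

def poisson_disk_sample(density, width, height, r_min=8.0, r_max=40.0,
                        max_tries=20000, rng=None):
    """Dart-throwing Poisson disk sampling with a spatially varying radius:
    high-density (high-FDE) regions use r_min, low-density regions r_max.
    A candidate is rejected if it falls within the smaller of its own and
    an existing point's exclusion radius."""
    rng = rng or random.Random(0)
    accepted = []  # (x, y, local radius)
    for _ in range(max_tries):
        x, y = rng.uniform(0, width), rng.uniform(0, height)
        r = r_max - (r_max - r_min) * density(x, y)
        if all(math.hypot(x - px, y - py) >= min(r, pr)
               for px, py, pr in accepted):
            accepted.append((x, y, r))
    return [(x, y) for x, y, _ in accepted]
```

With a constant density of 1.0 every accepted point keeps at least r_min
distance from every other, so the result is a blue-noise-like point set.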

Here's the fun part. "Reduce these". I was still half-asleep at this time, but
I think I've figured out how to practically do what my subconscious was
visualizing without being bothered by such details.

(Of course, the "simple" solution is to just randomly select a number of these
points equal to the desired piece count and apply a Voronoi tessellation to get
our base grid.)

Have a scanning direction option in the slicer (horizontal or vertical), and
also a maximum distance option. Call the current set of points the "working
set". Select a point near the top left, and draw a square centered about that
point such that the square has an edge length not more than twice the maximum
distance, and containing not more than some number of points (not sure on the
algorithm to choose this number; derive by experimentation?). Move the initial
point to the "final set" and remove all points within this square from the
working set. Choose a "best candidate" point to the right of or below the
square (depending on the scanning direction), where "best" is weighted between
single-axis distance from the edge and single-axis distance from the edge
bisector (either derive the relative weight by experimentation or have it be a
parameter). Build a new square with this point. Repeat until a row/column has
been processed. Then start over with the new top-left-most point in the working
set and repeat until no points remain in the working set. Take the Voronoi
diagram of
the final set as the slicing grid. The objective is to bias the slicing to be
"grid-like" while still allowing for size variations. (It might be desirable to
have a third bias option, "none", to skip this and use instead the simple
technique mentioned previously.)
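The reduction pass above could be sketched like this. Several details are
filled in by assumption, not taken from the description: the scan is horizontal
only, the square always has the full 2 * max_dist side length (no point-count
cap), and the candidate score is a simple linear blend controlled by a `bias`
parameter standing in for the experimentally derived weight.

```python
def reduce_points(points, max_dist=50.0, bias=0.5):
    """Simplified square-scan reduction (horizontal scan only).
    Repeatedly seeds a row at the top-left-most remaining point, clears
    the square around each seed, and hops to the best candidate to the
    right until the row is exhausted. Returns the final set."""
    working = set(points)
    final = []
    while working:
        # start a new row from the top-left-most remaining point
        seed = min(working, key=lambda p: (p[1], p[0]))
        while True:
            final.append(seed)
            sx, sy = seed
            # drop everything inside the square centered on the seed
            working = {p for p in working
                       if abs(p[0] - sx) > max_dist or abs(p[1] - sy) > max_dist}
            # candidates strictly to the right of the square
            cands = [p for p in working if p[0] > sx + max_dist]
            if not cands:
                break  # row finished; outer loop seeds the next row
            # "best" trades distance from the square's right edge against
            # distance from its horizontal bisector
            seed = min(cands, key=lambda p: bias * (p[0] - (sx + max_dist))
                       + (1.0 - bias) * abs(p[1] - sy))
    return final
```

Feeding the reduced set to a Voronoi implementation (e.g.
scipy.spatial.Voronoi) would then yield the base slicing grid.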

There are two other desired refinements (actually, these would apply to the
existing irregular slicer also, if it doesn't use them already). First, for
each cell vertex, find the largest angle α, and compute a weighted random
chance (α = 180° → p ≈ 0.9, α = 90° → p ≈ 0.3) of making the two edges that
form that angle G2 continuous at that point. Second, for each edge, compute a
weighted random chance (A = 1.6 → p = 1, A = 2.5 → p ≈ 0.1, with A = the ratio
between the edge length and the nominal (minimum?) plug width) of omitting the
plug for that edge.
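The two probability curves could be realized as follows. Only the anchor
values come from the description above; the linear interpolation between them
(clamped to [0, 1]) is an assumption, and any monotone curve through the same
anchors would do.

```python
def lerp_prob(x, x0, p0, x1, p1):
    """Probability linearly interpolated between two anchors, clamped to
    [0, 1]. The linear shape is an assumption; the anchors are given."""
    t = (x - x0) / (x1 - x0)
    return max(0.0, min(1.0, p0 + t * (p1 - p0)))

def smooth_join_prob(angle_deg):
    """Chance of making the edges at a vertex G2 continuous:
    α = 90° → p ≈ 0.3, α = 180° → p ≈ 0.9."""
    return lerp_prob(angle_deg, 90.0, 0.3, 180.0, 0.9)

def omit_plug_prob(ratio):
    """Chance of omitting the plug on an edge, with A = edge length /
    nominal plug width: A = 1.6 → p = 1, A = 2.5 → p ≈ 0.1."""
    return lerp_prob(ratio, 1.6, 1.0, 2.5, 0.1)
```

For example, a 135° vertex angle lands halfway between the anchors and gives
p = 0.6 for a smooth join.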

-- 
You are receiving this mail because:
You are watching all bug changes.